PC Pro Real World Computing on CD-ROM (Dennis Publishing, All Rights Reserved)
This database contains every Real World Computing article from the last five years of PC Pro. Click Continue to choose a section, or use the Search button to search for a specific article or keyword.
The Q+A (Technical Support) section is in a separate database. Click Menu to return to the main menu.
Jon Honeyball
The big change?
A few weeks ago, I attended the launch of Windows Server 2003 in London, and what a fascinating experience it was. It's clear that Microsoft really, really wants us to believe that this is the best, most stable, bomb-proof version of Windows ever released, and that we should be rushing out to upgrade for that reason alone. While this is a fairly predictable ambition for such a keen marketing team, it also indirectly illuminates a rather scary statistic that underlies their ardour; namely, th
Paul Ockenden and Mark Newton christen a new column with a lesson in scripting languages and tell you never to forget the three Rs.
February 1998
40 (February 1998)
WEB BUSINESS
Listing one
<HTML>
<HEAD>
<TITLE>Text with incremental size increase</TITLE>
</HEAD>
<BODY BGCOLOR=#FFFFFF>
<% for i = 2 to 7 %>
<FONT SIZE=<%= i %>>PC Pro</FONT><BR>
<% next %>
</BODY>
</HTML>
Listing two
#!/usr/bin/perl
print "Content-type: text/html\n";
print "\n";
print "<html>\n";
print "<HEAD>\n";
print " <TITLE>Text with incremental size increase</TITLE>\n";
print "</HEAD>\n";
print "<BODY BGCOLOR=#FFFFFF>\n";
for ($i = 2; $i <= 7; $i++) {
    print "<FONT SIZE="; print $i; print ">PC Pro</FONT><BR>\n";
}
print "</BODY>\n";
print "</html>\n";
ht play with all the latest toys, it's worth remembering that the introduction of any new technology at the client end will always have an uphill struggle to become the de facto standard, by which time there'll be another 'new kid on the block'.
An IIS gotcha
We recently had a problem with a Web site hosted on an IIS server that makes heavy use of session variables to track what the user's doing. Users were reporting that the site wasn't behaving as expected; we eventually traced this to the user clicking on a certain link while using Netscape. How could the choice of browser affect session variables (stored on the server)? The answer was a difference in case sensitivity between IE and Netscape when specifying the server's virtual directory in the browser URL. If, for example, your site is called Fred, the URL to your root project might be http://servername/Fred/. Create a link to home.asp like this:
<a href="/Fred/html/home.asp">Goto Home Page</a>
then you should have no problem getting session variables tracked properly between pages on your site. If, however, you write the link as:
<a href="/fred/html/home.asp">Goto Home Page</a>
then Netscape creates a new session because of its case sensitivity, and the session values seem to disappear. It's a mistake that's very easy to make, and incredibly difficult to find.
The three Rs
It's all very well having the latest techno-wizardry on your site, but people won't visit just because you have a fancy form or some clever animated text effects. You need to spend some time analysing the needs and interests of prospective site visitors, and provide appropriate content. Content is king! A broad rule of thumb is the 'Three Rs' principle. These are: reference, reward and recreation. If you can provide one of these your site will be good, two and your site will be above average, but include all three and your site has the potential to be stunning! We'll be talking a lot more about the three Rs over the coming months.
Once the design tools are enhanced to let you drag and drop objects anywhere on a page (and they stay there) you're finally talking about real DTP for the Web, instead of the battle to fix objects where you want them using tables, invisible GIFs and all those other sneaky tricks. FrontPage 98 is currently in beta, and already has some support for these new commands.
It may be a while before Dynamic HTML is universally used, but most of its new commands seem to fail gracefully on the older browsers, so with care you can use them to enhance a site without excluding too many users. This is an important consideration, as many large corporations don't let employees change their browser at will, so some of the biggest names your site is trying to attract may still be using fairly old versions. The Times stated the other day that '60 per cent of companies with more than a thousand employees are still using Windows 3.1'. While journos mig
d page-break-after (at long last giving us some control over how HTML pages print out).
You can redefine say <H1> to be any style you like, just as you can with <strong>, <B> or any other tag. Imagine the damage you can do with this. If you want to change all headings to a different colour, then by simply redefining one style, all the text with that tag will change within that page. Or, if you use linked style sheets, the change will be propagated to all other pages using that linked style sheet.
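As a sketch of how that linkage works (the file name house.css and the colour are invented for this example), the shared sheet might contain just:
/* house.css - one rule shared by every page that links this sheet */
H1 { color: navy; font-family: Arial, sans-serif }
and each page pulls it in from its HEAD:
<HEAD>
<LINK REL="stylesheet" TYPE="text/css" HREF="house.css">
</HEAD>
Change the H1 rule in house.css and every page linking the sheet changes with it.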
Style sheets make the design and maintenance of larger sites easier, and more importantly they introduce extra effects to basic HTML. One new feature Web designers will all love (and probably curse at times) is absolute and relative positioning. You can now place any object at an exact position on the browser page. The code for text works something like this:
<div STYLE="position:absolute; left:150; top:25">
This is some text
</div>
and like this for images:
<IMG SRC="image.gif" STYLE="position:absolute; left:300; top:25">
The main thing to bear in mind when evaluating Dynamic HTML commands is that they produce their effects locally in your browser, without needing another call to the server. But all those hidden objects still need to be loaded by the browser, so the pages can take longer to load than their appearance would suggest.
A handy object called the Data source object acts much like the data control in VB: when placed on a page it enables the user to scroll through records, sort them and update them, all on a locally stored copy of the data, taking some of the load off the server. You can achieve this sort of effect using different database cursors under Active Server Pages, but the data source object takes it a bit further.
As for scripting, there are different views as to whether this sort of data handling is best done on the client machine, when perhaps more records will be sent than the user actually needs, or on the server, where the server generates HTML pages dependent on the user's request. This second method puts a greater load on the server's CPU, but in many cases uses less network bandwidth. If used sensibly, that is by only sending small numbers of records to the data object - and definitely not your stock list of 1,000,000 items - it should become a useful tool for designing Web sites that talk to data sources, and we expect to see interesting implementations using this object before long.
Putting on the style
A huge change that Dynamic HTML brings involves Cascading Style Sheets (CSS). These are something like the styles you'll be familiar with from Word or DTP programs; they give documents a coherent look and let you make global changes throughout your whole site by making a change in one place. Within each style you can change font, colour, borders, cursor type and, most happily, page-break-before an
ll see a version of Dynamic HTML that will be supported by both major browsers. That has to be good news. In which case, is it worth the effort of learning Dynamic HTML? Having played with it for a while now - and we mean 'played', not 'worked', as we've yet to use DH for a real site - it's become obvious that it offers some very useful extensions.
Dynamic HTML lets you manipulate images from the browser code using tempting tags like Shadow, Blur, Glow and Alpha Transparency, as well as move and scale images on the fly, which should call forth a whole new generation of badly-designed and distracting sites. It offers many useful animation and graphical effects: for example, you might have just a heading showing, but when you click on it the full text appears underneath. Text and images can change whenever the mouse pointer passes over them, or when they're clicked, all without needing a script. All the objects on a page will support onmouseo
the Document Object Model, as described by the W3C Preliminary Requirements Document for the Document Object Model. As a result, Netscape Navigator exposes few page elements as objects, limiting layout and creative capabilities for authors. Page elements can be manipulated only while the page is loading, not after load time. Netscape's nonstandard implementation consists of JavaScript Accessible Style Sheets (JASS), layers and dynamic fonts (TrueDoc). These technologies are not supported in Microsoft Internet Explorer 4.
Netscape: (via email)
Yes, we do have a Document Object Model, we are working closely with the W3C in this working group and helped write the preliminary requirements doc. There are aspects that they cover right now, aspects they don't, and the same for us. There is not even a working draft specification at this time, but we do fully plan on implementing the standard DOM specification once it exists.
Dave Bottoms, client product PR specialist, Netscape C
cripts to work with all the common browsers, and keep testing those scripts as new browsers are released.
Dynamic fantastic?
Dynamic HTML is a buzz phrase right now, but hot new technologies for the Web seem to appear every five minutes, and Web developers need to weigh up whether the time it takes to learn a new technology will pay dividends. But Microsoft's sheer size and influence make its products always worth investigating, particularly if, as in the case of Dynamic HTML, the W3C consortium plans to ratify it. (W3C is the body with the unenviable job of bringing standards to the HTML language - in our book anyone doing anything to reduce differences between browsers deserves a medal!) Even so, Dynamic HTML is only supported by IE 4 just now, though Netscape is implementing it, but with a difference. Rather than taking sides in the browser war, let's just see what the two companies have to say:
Microsoft: (from the Web site)
Netscape's implementation does not support
h makes it fine for intranet sites that distribute a single target browser, but pretty useless for commercial Web sites. Having said that, it's amazing how many Web sites are designed to work only with Netscape (or IE), which from a marketing point of view is crazy! Imagine a company producing a TV commercial that could only be viewed on Sony sets?
Microsoft has recently confused things even more with the introduction of Scriptlets. At first this technology looks interesting, but scrape away the gloss and you'll see that scriptlets are little more than client-side include files, and at the moment they're only supported by IE 4. To continue the TV ad analogy, scriptlets restrict the audience not just to people with Sony sets, but to those with a particular model!
To summarise our advice on scripting: 1) Keep it on the server wherever possible. 2) On the server, try to use integrated and supported scripting facilities. 3) For client scripting use JavaScript, but design your s
ls like Mime type headers, line breaks and the correct use of semi-colons. It should be obvious even from this simple example that Active Scripting provides a cleaner solution, easier to develop and maintain.
Client reliant
On the client side, the choice is between JavaScript/JScript and VBScript. JavaScript has the advantage that it's supported by both IE and Netscape, or rather it should have that advantage: in truth there are certain things that work in IE's JScript but not in Netscape's JavaScript and vice versa. Then differences between versions of each browser muddy the waters even more. Take the code in Listing three (see Code) for example.
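Listing three itself lives in the Code section and isn't reproduced here, but the kind of cross-frame reference being described looks roughly like this sketch (the frame name follows the article; highlightButton() is an invented function defined in toolbar.asp):
<!-- in the frameset page: one frame is named "toolbar" -->
<FRAME NAME="toolbar" SRC="toolbar.asp">
<!-- in toc.asp: a script reaches into that frame by name -->
<SCRIPT LANGUAGE="JavaScript">
function showSection(section) {
// fine in Netscape 3 and IE 4, but in Netscape 4 'toolbar' is a reserved
// window property, so parent.toolbar no longer refers to the frame
parent.toolbar.highlightButton(section);
}
</SCRIPT>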
Manipulating an object in toolbar.asp from some JavaScript in toc.asp will work just fine in Netscape 3 but fails miserably in Netscape 4. Why? Because 'toolbar' becomes a reserved property name in Netscape 4! Who knows how many sites there are out there with a now-broken frame called 'toolbar'?
Only IE supports VBScript, whic
but we first reported this multiprocessor problem to Activeware more than a year ago, and there's still no sign of them even acknowledging it, let alone fixing it. A similar problem with ASP scripting was reported to Microsoft and got fixed within two weeks. It's also important to bear in mind that Microsoft's ASP and Netscape's server-side JavaScript are both vendor-supported technologies, whereas most Perl implementations aren't.
Another nice feature of Active Scripting is that the HTML and scripting code are mixed in the same file. Consider the example in Listing one (see Code). This script prints the text 'PC Pro' in fonts that increase in size from two to seven. This is a normal HTML page with three small bits of VBScript embedded between the <% and %> tags. Listing two (see Code) shows how you can do the same thing in Perl.
In Perl things are the other way round; the HTML page is embedded within the script, inside 'print' statements, and we have to take care of detai
cator, right through to folk accessing the Web from a Psion Organiser or other Web-enabled PDA.
Mind your language
Choosing a scripting language is slightly more complicated. When writing server-side scripts you may be able to use scripting facilities integrated into the server, like Microsoft's Active Scripting in IIS or Netscape's server-side JavaScript. Or there's a traditional CGI scripting language, the most popular of which is Perl.
At first Perl seems to be the obvious choice, as there's plenty of example and library code available, and a degree of cross-platform portability. In reality, something like Microsoft's Active Scripting has a distinct advantage in being far more closely integrated into the server's environment, which enables you to build a more feature-rich site. Also, we've found that the most common implementation of Perl for NT (by Activeware) has serious stability problems, especially when run on multiprocessor servers. Okay, all software has bugs,
evice driver plumbing of the operating system.
**He gets my award for having the honesty and integrity to speak his mind over the lack of driver-signing lockdown in Server 2003. At the London launch event, I asked him why it was that Server 2003 didn't explicitly forbid you from installing unsigned drivers. His response was clear and tinged with a little sadness: 'We didn't have the courage to'.
**It turns out that only a small percentage of the drivers come from within Microsoft, and
it still isn't possible to ensure all drivers are properly tested by the authoring companies and digitally signed. This is, of course, simply the nature of the beast - Windows 2003 has exceptional out-of-the-box hardware support and even greater support from third-party manufacturers, who will release their own drivers (often with enhancements and new features) once the OS has shipped.
**Personally, I wish Microsoft did have the courage to forbid unsigned drivers to be installed, even th
rewalling, I raised the difficult question of port 80 filtering with the Microsoft team. Now that all sorts of traffic is being pushed through this one port, being able to manage what comes in and out through 80 is becoming critically important. This raised a number of smiles and lots of nodding, and I think they have something up their sleeve for later in the year. It was also good to hear that they clearly understand people want to use both hardware firewall appliances like Cisco PIX and
SonicWALL, and software ones like ISA (Internet Security and Acceleration Server from Microsoft) and to deploy these in a layered fashion. I said it would be a great pity if forthcoming work from Microsoft in this field required one to have ISA as the primary boundary firewall, and I was assured this wouldn't be the case.
**Quote of the Month
*The accolade of Quote of the Month goes to Rob Short, corporate vice president of Windows Core technology. He's in charge of the Windows kernel and all the d
od to see Microsoft's latest work, which allows a server to harden itself down, blocking off all ports that aren't required for its functional configuration, and thus encouraging servers to be better protected from the surrounding infrastructure.
**For myself, I've now taken the drastic step of allowing automatic overnight updating of all my core servers and desktops. I know there's a risk associated with this, whereby a 'bad patch' might render a server unstable, but this is a risk I'm
prepared to take - at least for the majority of time when I'm in the UK and thus physically close to the servers. If I can get through several months of this procedure with no major crises, I might well recommend that some clients test it on their non-mission-critical servers too. Even given the hard firewall capabilities that I have in place, it's still possible that something nasty might get through, and this daily updating will hopefully keep these things at bay.
**On the subject of fi
column this month, so I'll stick to a few key points here.
**First, there's no question that this product group is totally focused on making the Windows platform more secure and easier to update. They asked us for feedback on how we thought they were doing, and we raised the obvious issues like Windows Update only updating the OS and not the applications, and that it was a mess to have to go to Office Update for some pieces and to various other places to get other security updates too. It
was made clear that we'll shortly be seeing a single, unified Microsoft Update site covering everything on the machine that comes from Redmond. It might even be able to hook into third-party applications too, but we'll have to wait and see on that front - it would certainly be nice to have a truly one-stop updating shop.
**We also talked about the need to harden servers and desktops within networks, and not to rely solely on the protection given by border defences like firewalls. It's go
ual Studio .NET system and deploy them straight onto 64-bit platforms as native 64-bit applications. No code will need to be changed, providing the project stays within the rules of the virtual world of the .NET Runtime. Calls to external services, however, especially to COM objects, will present a new level of problem.
**Server Security
*As you might imagine, for the release of Server 2003 Microsoft has taken the opportunity to talk at as great a length and as often as possible about sec
urity. I was fortunate to attend a meeting hosted by Stuart Okin, chief security officer of Microsoft. His background in Microsoft Consulting Services clearly identifies him as one whose knowledge on his subject area is pretty definitive. We were also joined by Craig Fiebig, general manager of Microsoft's Security Business Unit Product Management Group, and two of his colleagues, Guy Friedel and Zachary Gutt. I know David Moss is going to cover some of those discussions in the Back Office
installed on each machine. In essence, any application can be compiled and run anywhere that the run-time exists.
**This is mostly true, but it ignores the limitations of the various versions of the run-time frameworks available for different platforms. For example, the one that will exist on the smartphone is much smaller and less capable than the one that exists today on the Pocket PC platform, and this itself is only a subset of the one on the full Windows 32-bit desktop platform.
**With 64-bit Windows, we'll be getting a 64-bit .NET Runtime Framework - or, rather, it will be forthcoming but don't bet on it being in the first shrinkwrap. It might well be there as a beta release only, with full release next summer when the next big release of Visual Studio .NET is ready. This is the release that includes all the functionality of Yukon, the forthcoming database engine.
**So, once we get this new run-time, it should be possible to develop applications on a 32-bit Vis
ugh all this code and make it 64-bit clean. You can be assured that the core operating system is 32/64 clean, as Cutler and his core engineering group have ensured that their kernel code remains portable and processor non-specific. However, the rest is an example of the code-rot that we feared would happen once Windows was allowed to become a 32-bit-only, single-CPU family. And incurring The Wrath Of Cutler isn't something to be taken lightly - I'm sure you've heard the infamous story of h
im punching a hole through some wretch's office door and opening it from the inside.
**Just to be clear on this matter, these omissions won't affect the AMD 64-bit platform alone - they hit the Intel 64-bit platform too. Some help is at hand, though, and from a direction about which Microsoft has been rather quiet so far. Visual Studio .NET, as you'll know, compiles applications to Intermediate Language, or IL, which is then compiled and run on the fly by the .NET Runtime Framework that's
t AMD Windows XP and Server 2003? The official answer is currently running at around 'the end of this year', but once you look closely there are some issues that the marketeers would prefer you didn't notice. For example, a significant amount of more advanced 32-bit applications that come in the Windows shrinkwrap just aren't there for 64-bit. The reason? These applications were hard-coded with specifically 32-bit memory assumptions in place and can't benefit from the 'just pop it in the 6
4-bit compiler, crank the handle and watch a shiny new 64-bit application pop out' as promised by the PowerPoint slides.
**Indeed, I've heard that no less a Windows system luminary than David Cutler himself is far from amused to find that these pieces of the puzzle have gone 'missing in action' for the present. For an example of the problem, you'll soon discover that the whole Windows Media environment is hard-coded to 32 bits and won't port easily.
**Basically, Microsoft has to go thro
ll assume that everyone is as clever as them, and when they do provide Wizards it's supposed to be obvious which ones you should use and trust, and which are a little more dodgy (the infamous DCPROMO Wizard in Windows 2000 falls frighteningly into this latter category, of course). And the truth is that your Normal Distribution Curve of users will almost certainly want to wait until something more dramatic than Server 2003 comes along.
**64 Bits of Fun
*Only the most power-crazed and dema
nding of sysadmins will already be looking at the new AMD 64-bit processor, so not surprisingly I'll confess that I'm finding the new 64-bit AMD processors extremely tempting in a way that Intel's Itanium hasn't been so far. The prospect is of a dual-processor motherboard costing just a few hundred pounds, populated with processors, which don't cost much more, offering up a compelling 32-bit platform for today and a rosy 64-bit future after that.
**So when can we expect to receive 64-bi
s silly at the prospect of The Big Change. Hence they're still running NT 4, because no-one has had the time or energy to look at Windows 2000 AD in a serious way, other than by peeking at it from behind the sofa. There's probably much in Windows 2000 and AD that would be of great use to them, but just getting started is a daunting task.
**Finally, there are those who've moved, probably because of the need for Exchange 2000, and yet their network is running in Mixed mode, there are still
a few Backup Domain Controllers from NT 4 on the network, and Windows 2000 AD sysadmin tasks are best tackled after a large gin and tonic or two.
**It never ceases to amaze me that software products seem to be written not for the Normal Distribution Curve of skills possessed by actual members of the public, but for some strange phantasmagoric creature that can trot off IP addresses, do subnet mask calculations in its head and the like. In other words, people who write system software sti
at around half of the installed base of Windows Server is actually still NT 4 Service Pack 3.
**There can only be a few reasons why people are still running NT 4. The first - and most obvious - reason is that it works fine for them, they're not missing any mission-critical features and it will continue to work into the foreseeable future. It's hard to see how these people are going to be persuaded to upgrade any time soon.
**The reality is that NT 4 SP 3 is still a fine example of a bal
anced and focused platform that performs its tasks in a most effective way. Microsoft might eventually get these users to upgrade to Windows Server 2003 running the forthcoming Virtual Server product that it recently purchased from elsewhere, and at least this would allow sysadmins to keep running NT 4 with some semblance of modern driver support underneath the VM subsystem.
**The second group of people are those who've looked at Windows 2000 and Active Directory (AD) and scared themselve
script would be able to provide the answer without further Net access, whereas a server-based version has to reload the page with the answer inserted.
Forget the theory, though; experience shows there are two overwhelming reasons why you should keep scripts on the server side where possible. First, there's the matter of code security. With client-side scripting, your widget production cost algorithm is available for all, including your competitors, to see by simply selecting 'View Source'. Do you really want other widget producers to know how you calculate your widget pricing? Probably not.
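By way of illustration, keeping that calculation server-side can be as simple as a small ASP page in the style of Listing one (the field names and the pricing rule here are invented for the sketch):
<%
' widgetcalc.asp - the order form posts Quantity and DaysRequired here,
' so the pricing formula below never leaves the server
quantity = CInt(Request.Form("Quantity"))
days = CInt(Request.Form("DaysRequired"))
price = 1.50 + (100 / (quantity + 10)) + (5 / days)  ' invented rule
%>
<P>Price per widget: <%= FormatCurrency(price) %></P>
All the browser ever sees is the finished figure, not the formula.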
The second reason is browser compatibility. Not all Web browsers support scripting, and even when using suitable browsers like Internet Explorer (IE) or Netscape, some Web users experience abnormally high paranoia levels and turn scripting off. Keeping the scripts on the server side ensures anyone can use your widget calculator, from people running the latest version of Netscape Communi
ough it would cause all sorts of problems. For example, I've recently been looking at some printers from various manufacturers and found to my astonishment that all the current printer drivers I tried on the Hewlett-Packard website are unsigned. What in heaven's name does HP think it's up to? Releasing production drivers in an unsigned state might be excusable for a small software company that hasn't the time, resources or money to go through a full driver test certification suite and sign
ing process, but I don't expect this from a company the size of HP.
**Is it therefore surprising to note that these HP drivers were almost uniformly difficult to install? I failed to get the HP 2500 Color Laserjet driver to install at all on Server 2003. Not only are the drivers unsigned, but the setup routines are hugely complicated symphonies that boggle the mind. Trying to get a DOT4 layered USB driver installed on standard Windows XP SP 1 took most of the morning too.
**DXDiag
There's a very useful little tool that you'll probably find on your computer. I haven't checked earlier versions but I know it's on Server 2003 and probably on XP too. It's called DxDiag and it allows you to see at a glance how DirectX facilities are set up on your computer. Using this tool, you can see versions of files, capabilities and so forth, and enable and disable certain functions like Direct3D acceleration and AGP Texture Acceleration too. If you're trying to debug a problem with Di
rectX, this tool might well come in handy.
**It's especially useful if you try to follow the advice found at <A HREF="http://support.Microsoft.com" target="_BLANK">http://support.Microsoft.com</A> under the document number of Knowledge Base Article 311410, entitled 'How to install Plus! for Windows XP in Windows Server 2003'. I'll confess that I had to read that line several times before I could even appreciate what it was saying. Apparently, there exist in this world some people who wan
t to be able to install the Plus! Pack for XP onto a Server 2003 installation.
**This isn't just any old server, no sir-ee. The document states that the following advice is applicable to all versions of Server 2003, including Datacenter Edition. At this point, I was reduced to rocking back and forth in my chair, holding my head in my hands in a desperate attempt to cope with the idea of what bitter, twisted, utterly warped sensibility would want to install Plus! onto any Server 2003, let
alone the Datacenter Edition. I'd hoped that the Microsoft documentation would clearly state you were not to install Plus! onto any Server 2003 machines, under pain of death, but no, the instructions are given step-by-step. I have to quote them here:
**'To install Plus! for Windows XP, type the following command at a Command Prompt, where path is the path to the root of the Plus! for Windows XP installation source: msiexec.exe /qn+ /i \\path\Microsoft Plus! for Windows XP.msi
*'For examp
le, if you're installing Plus! for Windows XP from CD-ROM drive D, type: msiexec.exe /qn+ /i "D:\Microsoft Plus! for Windows XP.msi"
**'You receive a message that states whether or not the installation was successful shortly after you issue the command. After you install Plus! for Windows XP successfully, you can use the Add or Remove Programs tool in Control Panel to add or remove specific Plus! for Windows XP components.
**'Note: DirectX and Direct3D are required for many of the Plus!
for Windows XP components to work properly. DirectX and Direct3D aren't activated by default in the Windows Server 2003 family. To activate DirectX and Direct3D:
**1. Click Start, then Run.
*2. Type dxdiag, and then press Enter.
*3. Click the Display tab.
*4. Make sure that all of the DirectX features are set to Enabled'.
*To be fair, there's a little paragraph stating, 'Microsoft did not design Plus! for Windows XP to run on the Windows Server 2003 family', which is just about as un
derstated as a warning can get. Henceforth, let it be clearly understood that your Server 2003 installations aren't fully and properly configured unless they have Plus! - complete with Amanda the cheerleader and Seth the B-Boy - installed on them. You know it makes sense.
Jon Honeyball questions the demand for Windows Server 2003 and then juggles 32 and 64 bits
August 2003
106 (August 2003)
Advanced Windows
Simon Jones
Bundles of confusion
Microsoft has revealed the various editions of Office 2003 and the applications included in each edition, and at the same time has announced an extension to the beta programme. It can be hard work trying to make sense of all this, but that supposes there's any sense to be found.
**The six editions of Office 2003 are called Basic, Student and Teacher, Standard, Small Business, Professional and the confusingly named Professional Enterprise. These contain various mixtures of the Office appli
ide to run is straightforward: for instance, to animate an image as the user moves their mouse around the screen, you have to use client-side scripting. Similarly, to log the users' responses to a form on your site into a database, you'd almost certainly want to use server-side scripting. Other cases, however, aren't so clear cut.
Take a 'widget production' calculator. This has two input fields, the number of widgets required and how quickly they're needed, and it outputs the resulting price per widget. This is a calculation that could run on either the server or the client, and each side has various theoretical pros and cons. Client-side scripting requires the program that calculates the widget cost to be downloaded into the browser, and if this calculation is very complex this could take some time. A server-side calculator, however, only needs to download a standard HTML form for the inputs. On the other hand, once the two input values have been typed in, a client-based
ncentrate on how to integrate the Web into marketing and comms strategy, and how to make it work for you.
A long time ago (like two years) Web pages were flat and static. Much like the pages of this magazine, they were made up from a mixture of text and graphics, and the only user interaction was clicking on the underlined text of a hyperlink to move from one page to another. Nowadays Web users expect much more, and so in this first column we look at two of the technologies you can use to
liven up your site: scripting and Dynamic HTML.
Many people are rather afraid of scripting, which is hardly surprising as it's such a vast area with a lot to understand. Before you write a script, you have to make two very important choices: which scripting language to use, and whether to run it on the server or client side.
First, scripts. These can be run on the client side, executed within the Web browser, or on the server side, executed by the Web server. In some cases the choice of which s
David Moss and Jon Honeyball
Long-trousered OS
Windows Server 2003 represents a marked change for Microsoft, because it has decided to ship a long-trousered version of its server operating system - the long trousers will be needed by the system adopters or, more specifically, the system administrators. Installation is still a doddle, but anyone who hasn't been paying attention will get an awfully big shock when their newly updated system finally reboots.
**The key word here is 'lockdown'. Whereas Windows 2000 Server was wide open and
cations, but, worryingly, there are new Standard and Professional versions of Word, Excel, PowerPoint, Outlook and Access. Apparently, you can only buy the Standard versions as part of an Office 2003 Suite - if you buy any one of these applications on its own you'll get the Professional version. (Note that there are only Standard versions of Publisher, InfoPath and OneNote). It must make sense to someone in marketing...
**Table 1 shows the various components of the different bundles, whil
e Table 2 shows the form of licence required for each bundle. Note that you can't buy the Professional Enterprise Edition over the counter, as it will only be available through a volume-licence agreement. Similarly, you can't buy the Basic Edition except when it's bundled with a new PC.
**So what's special about the Professional versions of the major Office applications? The features reserved for these versions are Information Rights Management (IRM), customer-defined XML Schemas and the
SharePoint Datasheet List control. To be honest, the loss of IRM from the Small Business Edition isn't a big deal for most people - only large corporations that are paranoid about information leakage will ever implement IRM. The SharePoint Datasheet List control is nice to have, but how many small businesses are going to be installing SharePoint Services or SharePoint Portal Server, given that they require Windows Server 2003? Even if they do install it, SharePoint is perfectly usable with
out that particular control.
**The only worrying omission is the removal of customer-defined XML Schemas - this feature is one of the major upgrades for Word and Excel, and confining it to just the top two editions will severely limit its usefulness. What's the point in inventing a universal data interchange format (XML) if not everyone has the applications that can understand it? (Answer: to force people to spend more money.) To be useful, XML ability needs to become ubiquitous, so this
is a very short-sighted step.
**Small companies used to be able to save money off their Office licence bill by buying Office Developer or Office Professional for their IT people and Office Standard for everyone else. The only difference between Office Standard and Office Professional was that Professional came with Access and ordinary users didn't need to be able to create Access databases. If they needed to use an Access database, the IT department could give them the Access run-time m
odule, which was freely distributable if you bought Office Developer.
**Now with Office 2003, you'll notice that Standard will include only the Standard versions of Word and Excel, so you won't be able to use customer-defined XML Schemas. On-the-ball IT departments hoping to exploit this powerful new ability will have to fork out to upgrade all their users from Office XP Standard to Office Professional 2003.
**If you bought Software Assurance with Office XP - you could only get it on t
he Professional Edition - you'll be pleased to hear that Microsoft intends to automatically upgrade you to the new Professional Enterprise Edition, the only Office suite that carries InfoPath. You'll be able to roll out Office 2003 whenever you want. If you didn't take Software Assurance for Office last summer and want to use InfoPath, you'll have to either buy the Professional Enterprise Edition through a volume-licence scheme or buy InfoPath separately.
*Microsoft has repeatedly said tha
t volume-licence customers will no longer be able to purchase cheap upgrades of Office. You either take Software Assurance, effectively paying in advance for the upgrade, or you pay the full price for the next version.
**One much-applauded new application is missing from the tables: OneNote, the new voice note-taking application, won't be sold in any Office package. It will only be available as a standalone application through retail or volume-licensing channels, or else it may come bundl
ed with a new PC. You're definitely not getting this one for free and, if you want it, you're going to have to pay for it.
**Microsoft's press release about the various editions of Office 2003 says: 'Most enterprise customers would get great value out of Microsoft Office Professional 2003 Enterprise Edition, because it comes with all the Office applications, including ... Business Contact Manager.'
**Can we just stop at this point and remind ourselves that the Business Contact Manager a
pplication - which comes with the Small Business, Professional and Professional Enterprise editions - is of absolutely no use to even the smallest businesses, let alone enterprise-level corporate customers. It's strictly a single-user application and can't be extended in any way: you can't even customise its forms.
**If you're a single-person business and you have no intention of ever getting any bigger, you might want to look at Microsoft's Business Contact Manager, but there are far be
tter products out there that will grow with your needs rather than boxing you in. I feel sorry for the programmers at Microsoft who were forced to work on Business Contact Manager. I hope they manage to live it down and their careers don't suffer too much.
**Office 2003 Beta 2 Refresh
*Beta testing of Office 2003 is going so well, according to Microsoft, that it's going to issue another beta version. 'Beta 2 Refresh' will be available 'soon'. No-one is officially saying that this will de
lay the release-to-manufacture date, but this was due to be at the end of May and here we are at the beginning of May talking about a further beta version before we even get a Release Candidate, so I think it's a safe bet that Office 2003 won't be shipping quite as early as Microsoft first thought.
**Simon Marks, Office product manager with Microsoft, told me that, 'With the huge amount of feedback (over 600,000 copies of the Beta distributed) gathered through Beta 2, we were able to tho
roughly test the product in a way that hasn't been possible before - so the primary changes will be significant advances in reliability and performance. There won't be any big changes in functionality, except to add a couple of minor things that hadn't been completed in time for Beta 1.'
**I'm sorry that Office 2003 won't be out on time, but I'm very glad that Microsoft isn't trying to rush it to market before it has been thoroughly tested.
**SQL Server Images
*I had an email from a re
ader asking about storing pictures in SQL Server. Paul Coke said: 'For this project, I'm also going to have to store images in an SQL table. What's the best way to do this?'
**There are two approaches to storing images, including vector graphics, photographs, screenshots and similar material. They should be stored either in the Windows file system with only a UNC file name entered into the SQL table, or else directly into an SQL table in an IMAGE type column.
**The first route has the d
isadvantage that you'll have to take separate backup and security measures for these images, as their data isn't actually contained in the database and so not subject to SQL Server's access security or backup measures. SQL Server's 'Integrated Security' means you can design File System and SQL Server security permissions that are both assigned using the same Active Directory user groups, but this integrated security can't cope with all eventualities.
**Your security requirements may be s
mall, but splitting the security between SQL Server and the file system can still be problematic. Even if all your users should have read, write and delete access to all the records, you may still want to prevent them editing or deleting an image without updating or deleting the corresponding database record.
**If you take this file route, you'll probably want to program the user interface to collect the image from somewhere else and copy or move it into a particular folder. Keeping all
the images in one place makes backup and security easier, and means you can store the full folder path just once and then only have to store the names of the files, not a full path for each.
**Going down the second route means you'll have to create routines for storing and retrieving these BLOBs (Binary Large Objects) to and from the database. How you write these routines depends on the front-end language and data access library you're using. Microsoft's data access libraries such as DAO,
RDO and ADO provide GetChunk and AppendChunk methods to read and write BLOB data, however long it is. For pictures, the usual method is to read all the BLOB data into an array of bytes by repeatedly calling the GetChunk method until it doesn't return any more data. You write this data to a temporary file and finally use the LoadPicture method of an Image or PictureBox control to display the image.
**You should, of course, delete this possibly large temporary file when you're finished wi
th it to preserve resources. Uploading pictures to the database means opening the source file, reading chunks of bytes from it (up to 16KB at a time) and then using the AppendChunk method to write that chunk to the IMAGE column.
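**As a rough sketch of that chunked pattern - shown here in Java with JDBC streams rather than the ADO GetChunk/AppendChunk calls described above, and with an invented Pictures table (PictureID, ImageData) purely for illustration - the upload and download loops look something like this:

import java.io.*;
import java.sql.*;

public class BlobChunks {
    static final int CHUNK = 16 * 1024; // read and write 16KB at a time, as described above

    // Upload: stream the source file into the IMAGE column.
    static void upload(Connection con, int id, File source) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                 "UPDATE Pictures SET ImageData = ? WHERE PictureID = ?");
             InputStream in = new BufferedInputStream(new FileInputStream(source), CHUNK)) {
            ps.setBinaryStream(1, in, source.length());
            ps.setInt(2, id);
            ps.executeUpdate();
        }
    }

    // Download: pull the BLOB back a chunk at a time and write it to a temporary file,
    // which the caller should delete once the picture has been displayed.
    static File download(Connection con, int id) throws Exception {
        File tmp = File.createTempFile("picture", ".img");
        try (PreparedStatement ps = con.prepareStatement(
                 "SELECT ImageData FROM Pictures WHERE PictureID = ?")) {
            ps.setInt(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    try (InputStream in = rs.getBinaryStream(1);
                         OutputStream out = new BufferedOutputStream(new FileOutputStream(tmp))) {
                        byte[] buf = new byte[CHUNK];
                        int n;
                        while ((n = in.read(buf)) != -1) { // keep going until no more data comes back
                            out.write(buf, 0, n);
                        }
                    }
                }
            }
        }
        return tmp;
    }
}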
**Take care when debugging routines like this, as the chunk pointer, which shows where the next chunk of data will come from, advances forwards only through the file, and it does so every time the GetChunk method is called, however it's called. This can include '
watching' the GetChunk result or even hovering over the GetChunk call with the mouse pointer, so a small slip with the mouse while debugging can mean your image is retrieved mangled or not retrieved at all, and it can take hours to work out what's going on.
**Visual Studio .NET uses a different data access library (ADO .NET) and a new PictureBox control. You can use the PictureBox.Image.Save() method to save a picture to a MemoryStream object, convert it into an array of bytes and save i
t directly in a DataSet, or send it as a parameter to an INSERT or UPDATE query.
**Getting the picture out of the database is the reverse operation. From the DataSet or DataReader, you get an array of bytes from which you create a MemoryStream object. You then assign this to the PictureBox.Image property using the Image.FromStream() method. No temporary files are involved.
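**The same no-temporary-files round trip can be sketched in Java, as a rough analogue of the ADO .NET pattern above: the image is encoded to a byte array in memory for the INSERT or UPDATE, and rebuilt from the returned byte array for display. The PNG encoding here is just an example choice.

import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class ImageBytes {
    // Encode an in-memory image as a byte array, ready to be sent as a query parameter.
    static byte[] toBytes(BufferedImage image) throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        ImageIO.write(image, "png", buffer);
        return buffer.toByteArray();
    }

    // Rebuild an image from the byte array returned by the query, ready for a picture control.
    static BufferedImage fromBytes(byte[] data) throws Exception {
        return ImageIO.read(new ByteArrayInputStream(data));
    }
}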
**If you're feeling clever, you can encapsulate all this into a Binding object and use that to directly bind a Pic
tureBox to the IMAGE column in a DataSet. This Binding object's Format event passes you the value from the database and you get to mangle it into the format you want for display. See Code Listing One for one way to do this: the Parse event is called for the reverse process where you get to say how the value is to be stored.
**However, you should be careful about the use of images with ADO .NET. If you include the IMAGE column in your main dataset, you'll get the images for all the rows re
turned, which could give you a large overhead for storage on the client and transport over the network. It's usually a good idea to limit the main dataset to characters, numbers, dates and so on, only retrieving the images for specific records when you actually need them. Of course, there are exceptions, such as when your application has to work fully while disconnected from the database, but you should generally be wary about retrieving images and only do it when absolutely necessary.
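**One way to express that 'fetch images only when needed' rule - again sketched in Java/JDBC with hypothetical table and column names rather than ADO .NET code - is to keep the IMAGE column out of the main listing query altogether and pull a single picture by key only when the user asks for it:

import java.sql.*;

public class LazyImages {
    // Main listing: characters, numbers and dates only - no image data crosses the network.
    static ResultSet listProducts(Connection con) throws SQLException {
        return con.createStatement().executeQuery(
            "SELECT ProductID, Name, Price, LastUpdated FROM Products");
    }

    // Retrieve one image on demand, for the record the user has actually selected.
    static byte[] loadImage(Connection con, int productId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                 "SELECT Photo FROM Products WHERE ProductID = ?")) {
            ps.setInt(1, productId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getBytes(1) : null;
            }
        }
    }
}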
Simon Jones gets involved in some Office politics as he tries to make sense of the different editions
August 2003
106 (August 2003)
Applications
Paul Lynch
17 inches of power
Size does matter - there's no doubt about that - but in this column I'm most often concerned about whether something is small enough rather than large enough. The various Palm models are a good example: small enough to fit in a jacket or shirt pocket, but not so small that I can't use them. The same applies to the foldable keyboard: it's full-sized, so my fingers fit on the keys - any smaller and it would be disadvantageous. Only the most recent iPAQ and Hewlett-Packard Pocket PC models ha
relatively easy to install and administer with limited knowledge, Windows Server 2003 is locked down from the start. It presents you with the task of opening up just those components you require in order to restore functionality. This is great from a security point of view, but it's a nightmare for the sort of person who got designated system administrator simply because their cubicle was closest to the printer.
**It doesn't matter what your previous settings were under Windows 2000 or NT
. Once you upgrade your server to Windows Server 2003, you'll get Microsoft's minimal settings, which force you to deliberately unfetter most aspects of your system again. You might, for example, decide to fire up Internet Explorer after installation is complete, perhaps to log into your company intranet, or perhaps simply by kicking off Windows Update to see if any updates or security patches are available.
**You'll immediately be greeted by a message box telling you that Microsoft Inter
net Explorer's Enhanced Security Configuration (ESC) is enabled on this server. Before you even contemplate browsing anything, you're forced to enable that access yourself - and this applies equally to internal as well as external sites, as we shall see shortly.
**Effectively, you have to open the holes in your security yourself, because Microsoft has taken the bull by the horns and reacted to the 'insecure' jibes by removing default privileges for just about everything imaginable.
**Quite frankly, I applaud this move. If anything, I wish Microsoft had gone further still and insisted that only signed drivers be deployable on the server, but, given the deplorable state of affairs that currently exists, a move that bold would have stopped a huge number of peripherals from working at all. The number of unsigned printer drivers, for example, available for download as the 'latest' driver is quite shocking. While signing a driver is no guarantee of its quality, it's sig
nificant that when I took part in a recent printer test a considerable number of the unsigned drivers failed to install correctly on Windows XP Home. Their signed brethren - found on the CD that shipped with the printer - installed without difficulty.
**Internet Explorer on Windows Server 2003, by default, blocks access to scripting, file downloads, ActiveX controls and the Microsoft VM. A whole slew of other settings that used to be enabled by default are now disabled by default. If you
insist on browsing the Internet from your server, at the very least you'll have to keep adding sites to your Trusted Sites zone - not a great idea - and almost certainly will have to make frequent trips to the Advanced Settings dialog to discover why yet another browser feature has stopped working.
**Microsoft hasn't even shown particular favouritism here: the Windows Update site is included among the Trusted Sites list, but when I fired up the brand-new Group Policy Management Console (
GPMC) and started randomly clicking on entries to see what was under the hood I was quickly warned that if I wanted a certain functionality I'd have to add an item named 'about:security_ mmc.exe' to the Trusted Sites zone.
**This warning and request will appear any time you try and view a report in the GPMC, until you relent and add the item that scripts the expansion and contraction of report pages. And if you choose not to add it, all reports are permanently expanded and you won't be ab
le to collapse sections at all. In the old days, Microsoft wouldn't have given a second's thought to adding this functionality by default and saving you the annoyance. It's an indication of the serious approach the company is now taking to security that it has clearly decided it would sooner irritate than placate.
**As an aside, one of the things you can do within the GPMC is view explanatory dialogs on the function of various settings that you can access in the Administrative Templates a
rea. The first time you do this, a dialog springs up enquiring as to whether you want to add 'about:blank' to the Trusted Site list. This isn't a good idea. The worst that can happen if you don't add it is that you lose the Print and Close buttons from the explanatory text dialogs. Use drag and drop and Notepad if you want to do some printing - it's better than blowing a large hole in the side of your security settings.
**But don't just take my word for it, because this is exactly what M
icrosoft recommends in this instance. Ease of use sacrificed in the face of a security requirement? The new face of Microsoft indeed...
**The new, secure face of Microsoft was in London two days ago in the shape of Craig Fiebig, general manager of the security business unit (part of the product management group) at Microsoft. After a most instructive meeting, a group of us came away with a clear idea of just how Microsoft's people are now focusing their minds on the security issue.
**To take a simple example, we all know time-to-market is very important when launching a product into one of the most cut-throat industries in the world. This didn't prevent Microsoft grabbing all its developers, stopping their code-developing activities and putting them through courses on writing secure code before allowing them back into their rooms. This changed the way Microsoft managed code - now it can immediately identify the individual who has written a particular line of code. It's a
move I imagine does wonders for concentrating people's minds, in combination with those reports that instant execution is now promised for anyone who fails to implement the new secure code guidelines. Life at Redmond has become very uncomfortable if you don't take security seriously, and people are running code audits to ensure the guidelines are being adhered to.
**Microsoft is well aware of the need to simplify the update procedures for a whole host of its products, and my idea of hav
ing a single site for all updates (issue 103) is to become a reality. I'm not going to claim it was my suggestion in this column that sparked that particular move, though. I pointed out that one limitation of Windows Update was that it didn't tell you about all available patches, just those that affected the software currently installed on your system. I suggested it would be helpful for us to be able to see what else existed in the patch tree. I might be unsure whether I needed a certain
update, but there would have been no such question had I been able to see that installing that update would give me access to other useful/critical patches.
**Only time will tell if that suggestion will also be taken up, although I think that it may well appear in one form or another as Craig Fiebig was most open to suggestion and seemed to like the idea.
**Off to see the Wizard
*I also pointed out to him that I felt the move to total lockdown was a bold one given what it might do to th
e heart condition of the unwary sysadmin. Microsoft clearly feels the same way, as there's a Wizard in the offing - it wasn't ready in time for the shrink-wrap - that will go a long way toward helping people quickly configure their servers as they want them to be without blowing their new found security out of the water. The Security Configuration Wizard looks like being a rather splendid tool that all admins will want to add to their armoury as soon as they can get hold of it, which possi
bly could be by the time you read this column.
**The Wizard enables you to easily set up a system to perform a specified role, or carry out a specific process, in such a way as to only implement those features required without touching anything else. Rather neatly, the Wizard gets its data from an XML file, which means your company can make its own updates to the content, enabling it to accurately reflect implementations within your own business. It's effectively a knowledge base of all t
he items necessary to implement a particular service (port numbers, dependencies, communication protocols, role definitions and so on). Assuming that the service is already installed on a system, it will enable the administrator to activate it secure in the knowledge that this is all that's being activated, and that no inadvertent configuration errors have occurred. It's rather useful that not only does it ensure you get a consistent configuration every time you use the Wizard, but if you
take the time to look at the contents of each page you'll see just what services are required for a server to perform specific roles.
**You could, for example, decide to apply the web server role to a system: as you progress through the pages, you'll see in the pre-selected items on each page just what services are needed to enable a server to perform this role. While an experienced sysadmin would probably have known that they would want a web server to carry out DNS resolution, they migh
t not have known that COM+ is needed for ASP pages. The Wizard ensures this sort of problem never arises, as it knows precisely what's required.
**Should some of these options not meet your approval, you can opt to not install any service, and you can also add services of your own should the need arise - there's plenty of opportunity for customisation, if necessary. On one page, the Wizard even shows you running services that aren't in its knowledge base: each one is selected and will be
left to continue running after the Wizard has done its stuff, but it gives you a chance to see if there's anything of which you don't approve.
**You also get the opportunity to decide whether unspecified services are left running, or whether they should be disabled automatically when the server role is applied. Plus, there are further pages that deal with the ports that will be affected, filtering, encryption and more. This seems like a comprehensive tool as it stands, but it's still in b
eta, so there may well be changes by the time it becomes available. It's a shame it wasn't ready for the shrink-wrap, as I can see this being a highly useful aid that will save a great deal of time.
**The Security Configuration Wizard is just one of a number of tools that will be available to system administrators. Another, which puts in an appearance in the Enterprise and Datacenter Editions, is the Windows System Resource Manager.
**This tool enables administrators to control resources
on both those platforms, specifically memory and CPU, and how they should be distributed among processes, applications and services. It's equally at home dealing with resource distribution among multiple applications as it is handling your chosen resource limits for multiple users on a system using Terminal Services.
**This will get some serious attention, but I'll be willing to bet the one tool that will be grasped with both hands and pored over with fervent interest on all the various
Windows Server 2003 editions is the GPMC I mentioned earlier. This will let you configure all your group policies from one place. It provides you with the capability to back up and restore Group Policy Objects (GPOs), to see reports on those GPOs (reports become a little easier to control once you've pacified Internet Explorer's Enhanced Security Configuration), to import and export GPO and Windows Management Instrumentation (WMI) filters, and to do lots of other yummy things.
**It does all this fro
m the comfort zone of a management console interface. It means you can get up to speed in no time at all, even if you know you're in for a massive head-down learning experience as you attempt to get to grips with the power that has just been placed in your hands.
**The console can be run on a Windows 2003 server, naturally, but it can also be run from a workstation running Windows XP Professional so long as it has SP 1 and a particular post-SP 1 hotfix installed already (which is handily
supplied with the GPMC download). The GPMC enables you to manage multiple forests from the same console and, once you have a Windows 2003 server set up as a domain controller, you get access to something known as the Resultant Set of Policy (RSoP) feature of Windows Server 2003. This has two modes: Planning and Logging. The former enables you to simulate the effects of your group policies before you actually apply them on your live systems. The latter provides you with information showing
you what exactly happened to a user/computer after a policy was applied.
**These features are found under the Group Policy Modelling and Group Policy Results folders respectively, and the two other folders in the simple interface are for the domains - all presented as peers regardless of their DNS relationship, as group policies aren't inheritable between domains - and sites. While the data for most of these items is automatically generated, Microsoft chooses to let the administrator dec
ide on the number of sites that will be displayed in the Sites folder, something that will probably be well received by organisations with a large number of sites that would otherwise have required lengthy probing.
**In short, Windows Server 2003 marks a genuine change of approach by Microsoft. I hope it proves to be a solid platform from the word go, as I like the idea of locked-down systems very much. I can also only offer up the hope that Microsoft is planning to do a ground-up rewrite
of Internet Explorer based on its new secure code initiative, as I'm sure that would also be very well received by everyone who uses Windows and the Internet on a regular basis.
David Moss watches Microsoft turn the tables on its critics, increasing security and helping sysadmins
August 2003
106 (August 2003)
Back Office
David Evnull reveals the subtleties that differentiate not only Unix and Internet mail, but acronyms such as MDA and MTA, too.
February 1998
40 (February 1998)
UNIX/UNIX SERVERS
Paul Ockenden and Mark Newton
Changing the Script
Welcome to a brand-new Real World column about designing for the Web. So does the world really need yet another Internet column? Well, yes and no. Yes, we believe there's a need for a column like this, but no, it isn't simply another Internet column.
PC Pro already has an excellent column for Net users by Davey Winder, but we intend to write about the Net from the other side of the fence - from the point of view of companies trying to establish a credible and effective Web presence. We'll co
ve begun to match the Palms in being small enough.
**The same criterion applies to laptops. Ever since the first Compaq 'sewing machine' portables, I've always wanted smaller and lighter units, and I use several Sony and Sharp ultra portables with some satisfaction. However, with laptops, as with PDAs, enough is enough. The smallest VAIO models with distorted screen shapes may be miracles of miniaturisation, but most applications aren't well-suited to their display format. Packing for a t
rip is when the trade-off between size, weight and functionality really bites, and I often find myself taking a 'proper' laptop and a wireless Palm in preference to my VAIO ultra portable.
**Is there such a thing as too big, though? I'll soon be in a position to answer this as I've indulged in one of Apple's new 17in PowerBooks. Many of us use laptops in situations where they're not required to be very portable, although in a completely immovable application a desktop PC should logically
win every way up. It has a higher specification and a cheaper price than any comparable laptop, is more flexible for upgrades and cheaper to run. The components in most desktops can be individually replaced, repaired and upgraded, whereas the majority of laptops have to be replaced if a single component fails. Also, because of their compact structure, they suffer more from heat stress and other traumas.
**Having said that, and cost apart, a laptop can still make an effective desktop subs
titute, especially a large-screen laptop, as I find that I work more effectively on large displays. On a small display, I have to stick to one task at a time, but with a large display I can keep several applications open simultaneously and switch between them. I find Mac OS X better than Windows XP for this, as its default window design is better suited for keeping multiple windows on screen at once and switching between hidden states of applications. If I'm dependent on peripherals, thoug
h, forget the laptop, as switching between scanners, tablets and specialist devices can be a major hassle. Even so, FireWire and 802.11b are making it less of a hassle than it used to be, and I hardly ever need to reboot to swap devices nowadays, and routinely plug in a FireWire disk for backup. I'm impressed with those USB key-fob memory sticks, but they still don't hold enough for my typical usage.
**My experience so far with the 17in PowerBook has been rather brief. It certainly seems
effective enough for my needs, but there have been teething troubles that have prevented me from being able to subject it to the full gamut of plane flights and hotels, though for day-to-day programming and general work it's good. Its AirPort reception is much better than with the old PowerBook I'm using to write this column - as good as the iBook, I'd guess, without actually measuring it. My main reason for using PowerBooks is that they support a dual-monitor mode that's excellent for run
ning training courses - add an external projector and a Keyspan remote control, and I'm in business. Without this dual-monitor facility, I'm stuck with nowhere to read my notes on the slides (apart from the very fallible projector inside my head). Of course, the 12in model would be perfectly adequate for this application. But given my preference for more screen real estate when programming, the 17in was much more attractive, and I grasped the opportunity to size it up for more general port
able use as well as this specific application for training courses.
**The reason I'm still using my old PowerBook instead of the 17in one is that on its very first day I inserted a CD to evaluate its sound quality, and the machine started hard crashing (which is very unusual). Removing the CD and running a manual fsck (that cosy old Unix standby: I'm glad that some things never change) got it running again, but now without the AirPort working, which was a real pain.
*I was in the middl
e of a high-pressure programming project so I didn't have time to do much about it. I used Disk Utility to rebuild its disk, disappointed that I couldn't run the more dependable DiskWarrior from Alsoft. It might have been possible with a lot of effort, but it wasn't worth it to me - DiskWarrior requires a Mac OS 9 bootable machine and doesn't yet support checking from a bootable Mac OS X CD. So I reformatted, repartitioned and reinstalled, but still no working AirPort. Apple provides the Pow
erBook with a 12-month warranty, so I invoked it. The support service agreed with me that it was broken, but couldn't repair it - I assume that was because the AirPort hardware is now built-in and not on a removable PC Card. So I'm now waiting several weeks for Apple to get more in stock, which is a pretty poor show for warranty support.
**AirPort range
*I've gone on at some length previously about the PowerBook G4's rather strange aerial mechanism, wherein the AirPort antenna is enclos
ed within a metal chassis so that it has a far shorter effective range than the iBook (which is especially good).
**How Apple managed to produce one product with better-than-average range immediately after one with abysmal range I can't explain. The new 17in PowerBook model is, as I've already mentioned, a lot better and comparable to the old iBook. However, if you have an older PowerBook G4, I do have a couple of tips to pass on.
**The first couple are simple and don't involve breakin
g any warranties. First, remove the power adaptor cable and your range will suddenly improve. The rationale behind this one is that the poor reception is caused by all the metal in the case acting as a Faraday cage and absorbing all the power from the network transmissions. The (imperfect) earth coupling provided through the power cable increases the effectiveness of this Faraday cage, reducing the energy of the waves getting through to the AirPort antenna inside the case.
**The antenna w
ire runs from the PC Card format AirPort receiver in the middle of the casing to a wire or metal strip situated inside the outer frame of the case. This runs through the frame next to the battery compartment, and apparently a connection in there can sometimes come loose. If you remove the battery and look inside the compartment, there's a plastic strip with the identification information (model details and date of manufacture) on the right. Press firmly on this strip
and run your fingers backwards and forwards along it a couple of times and your reception may improve - it didn't help me unfortunately.
**The two other solutions are slightly more extreme and will break your warranty, but they should be perfectly safe to carry out. One is to modify your AirPort base station to use a proper aerial - non-Apple ones may already have an aerial, or at least are likely to have an easily reached socket for an external aerial. The Apple base station uses a stan
dard PC Card 802.11b device with one of these aerial sockets, and if you open the case (warranty now void), you'll be able to see this, so drilling a hole in the case to take a standard aerial wire is the worst that you'll have to do.
**Given that the (old model) PowerBook G4 uses a standard PC Card AirPort card, you can also remove it from inside the case and reinsert it in the external PC Card slot, attaching an aerial to the end of the card - this is a standard fitment from other suppl
iers of the same cards.
**Newton Exhumed
*I was reminded recently that we've passed the five-year anniversary of Newton being buried by Apple. The more I think about it, the more I consider that Newton was the best PDA ever made. It had some problems, basically to do with size. The first MessagePad, known as the OMP ('Original MessagePad'), had the best form factor, but the larger display and writing area of the last model, the MP2100, had its positive points. Nowadays, it would be consid
ered a mini-tablet rather than a PDA, while the eMate, designed to use the same operating software, was close in concept to current Tablet PCs. Newton was ahead of its time in many ways: its handwriting recognition is still easily a match for any current handwriting-recognition software, it used the ARM processor well before the current fashion and supported networking very early on, giving it a modern design.
**The Newton also gave birth to the most successful of PDAs, the Palm series. P
alm started out using Graffiti, developed originally as an alternative handwriting-recognition system for Newton, while Palm OS was an excellent attempt to mould the same user interface onto a much lower-resolution display, with the same high degree of attention to detail over usability. The Newton user community is still very active - I checked a couple of the software developers I find most reliable for Newton software and they're both in business and able to supply Newton products. New
ton has an active mailing list as well as some up-to-date software - if you check my list of links, you should find an 802.11b network driver, Mac OS X synchronisation software and updates to the IrComm software.
Paul Lynch finds that the new PowerBook really is the Apple of his eye
August 2003
106 (August 2003)
Mobile Computing
Kevin Partner
Thinking ahead
Creating multimedia projects is all about sharing. There's no fun in putting together the most sophisticated, kick-arse presentation, game or e-learning application only to find that it works on your own PC and no-one else's. And if you create multimedia applications for a living, the future of your business depends on your masterpiece working properly both on your client's PC and, often, on your client's clients too.
**This is true for most software development, but, by its very nature,
abled object ready to be accessed by remote apps. ObjectSpace is already saying that Voyager 1.1 will also talk CORBA to enable it to plug into other CORBA applications; this should also allow it to talk to C, C++ and other languages.
You may wonder where Java's RMI (remote method invocation) fits into the picture. It doesn't. Voyager sidesteps RMI, taking on all its own work but leveraging all the language enhancements that were added to support RMI. My experience is that in developing with it and running code, Voyager is faster than RMI. Add to this Microsoft's reluctance to release and support RMI and there's the makings of a dilemma - it's your call whether to ask folks to install RMI on IE or just send the Voyager classes. Me, I'm sticking with Voyager - for now anyway. All this fab stuff can be downloaded from www.objectspace.com along with everything else you need to start making those distributed mobile agent systems.
gy, but the best part is it's free for commercial and non-commercial use. Yes, that's right. Like JGL before it, Voyager is free to everyone except, of course, IDE and other development environment makers who'd like to add it to their toolset. Fair enough, I say; if they're going to add a feature to their system, they should be made to pay for it, like the numerous companies that licensed JGL. Free software can mean ropey documentation or an absence of support, but ObjectSpace skirts round both of these pitfalls with a 400-plus page manual and informal (but excellent) support through mailing lists.
If you want more, proper commercial developer support and major enhancements/add-ons can be had by plying ObjectSpace with money units, which is where it makes its crust. Experience with the JGL shows how well it supports software (now in its third major version), as it now has numerous enhancements including Voyager support, so you can create an Array or HashMap as a Voyager-en
es objects between a client and server to pick up data sets, and I've managed to pick up the basics of Voyager in less than a week. I say the basics simply because there's so much in Voyager to explore.
Voyager comes with a simple persistent data store so that you can save objects and resurrect them automatically when someone calls for them. There's a resilient messaging system too that allows for point-to-point or one-to-many messages between objects. It also has the concept of linked spaces that automatically pass messages safely between each other to avoid dreaded message loops. Then there's support for applets that want to be Voyager clients so that they too can join in the networked distributed fun. On top of all this are agents, which in Voyager are objects that can move themselves around the network, picking up data and then returning home to spill their guts about whatever they found. In fact, this is how I've envisioned 'agents' for a long time. It's great technolo
framework for developing mobile objects and agents, and supporting distributed Java-based apps.
Perhaps the most stunning feature of Voyager is how easy it is to use. You take your Java class - say, Calculator, which has a method called add - and run the supplied utility, which creates a virtual version of the class identified by a 'V' at the front of the class name. So Calculator.java is then complemented by Vcalculator.java. Compile the two and you can now write a program that creates an instance of the Calculator along with a virtual version - the world is then your bivalve with enclosed pearl.
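Voyager's generated virtual class, and the runtime that moves it around the network, obviously can't be reproduced here. But as a rough, self-contained illustration of the underlying idea - a stand-in object that intercepts each method call and forwards it to the real instance, wherever that happens to live - the same Calculator can be mocked up using nothing more than standard Java dynamic proxies. Only the Calculator name and its add method come from the example above; everything else is illustrative and is not ObjectSpace's actual API.

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class VirtualCalculatorDemo {

    // The ordinary class, expressed as an interface so it can be proxied.
    interface Calculator {
        int add(int a, int b);
    }

    static class CalculatorImpl implements Calculator {
        public int add(int a, int b) { return a + b; }
    }

    public static void main(String[] args) {
        Calculator real = new CalculatorImpl();

        // The 'virtual' calculator: every call is intercepted and forwarded.
        // Voyager's generated class does this across the network; here the
        // forwarding is local, purely to show the shape of the pattern.
        Calculator virtual = (Calculator) Proxy.newProxyInstance(
            Calculator.class.getClassLoader(),
            new Class<?>[] { Calculator.class },
            new InvocationHandler() {
                public Object invoke(Object proxy, Method method, Object[] argv) throws Throwable {
                    System.out.println("forwarding call to " + method.getName());
                    return method.invoke(real, argv);
                }
            });

        System.out.println("2 + 3 = " + virtual.add(2, 3));
    }
}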
Other programs running on other networked machines can locate your new virtual Calculator and call its methods, say, the add method. This virtual object can be moved from machine to machine, so you can create a load-balancing network or else just move objects around to bring them closer to their data sources or related objects. I've been working on an application which shuttl
Steve Cassidy
Remotely interested
Two months back, my colleagues in PC Pro's Enterprise section reviewed a clutch of remote-control tools - those rather sleight-of-hand gadgets that let you control a remote machine from inside a window on your PC's desktop (issue 104). These have become quite the thing to have in corporate environments nowadays, so much so that, in a couple of extremely large networks I've seen, it often feels as though you're doing your actual data processing using the handful of CPU cycles left over afte
a multimedia application depends on the presence on the host system of, for example, the correct sound and video drivers, ActiveX components and codecs. For us developers, it's rarely acceptable to simply insist our end users have the latest operating system plus patches, DirectX installation, memory requirements and up-to-date drivers. Rather, it's up to us to make sure our applications work on the widest range of setups. This sort of attention to detail is mission-critical - at least as
important as any other aspect of development - and failing to pay attention to it is the quickest way I know to undermine your credibility with both clients and users.
**Of course, the simplest way to achieve this compatibility is to create multimedia applications that require the bare minimum configuration. At NlightN, we support the following minimum spec for most of our projects: Windows 98 SE, 64MB of RAM, Pentium III/350, Sound Blaster sound card, DirectX 6 and 16-bit colour depth. C
hoosing such a minimum specification needs to be discussed with your client, but you must bear in mind the risk of working to a very low spec. You need to pick a specification that will allow your software to work acceptably while not denying you the bells and whistles that are so much a part of a multimedia project.
**Naturally, you won't want to actually create the software on a low-spec PC, so it's essential that you invest in a bare-bones PC, reserved solely for the purpose of t
esting. Get a copy of Drive Image and save an image of the fresh installation so you can restore a clean Windows after each testing session, just in case your setup program adds components. The good news is that such a PC will cost you next to nothing and will repay itself over and over again as you catch configuration errors before distributing software to your clients. Any large-scale multimedia project should be prototyped on the target PC as soon as possible, so any compatibility issue
s can be resolved before the project goes too far. The bigger the project, the more you have to lose by leaving testing too late.
**As a rule, users and their IT administrators don't like setup programs that force the installation of huge libraries such as the .NET Framework, video/audio libraries or DirectX. With many corporate clients nowadays, requiring such an installation can kill off a project if not dealt with at the beginning. Again, there's a balance to be struck here: your multi
media product has to be fully functional, but it's better to use only formats and controls already supported by your user's PC.
**For video, the best format is usually MPEG-1, which offers reasonable playback quality, low hardware demands and widespread compatibility. If you must use AVI, pick either the Indeo 5 or 3.2 codecs, as these are widely supported and offer fair quality. Avoid QuickTime movies unless you relish the idea of talking your users through QuickTime's installation proc
ess and the problems it regularly causes.
**If you do decide on MPEG-1, you'll need to think about the optimum combination of image quality, file size and compression for your particular project. As a rule, compress as little as possible, because this places less of a strain on the processor of the playback PC. It does, however, also result in bigger files and therefore a more bulky installation. Again, it's a question of finding the right compromise, and experimentation is important.
*With audio, your choice is between the MP3 and WAV formats. If file size isn't an issue, WAV is the ideal choice, as it will play back on every conceivable Windows installation. For audio narration, I tend to use a 44kHz sample rate at 16-bit resolution in just a single channel, while music uses the same settings but in stereo. You can get away with 22kHz, but if you're doing that for reasons of file size you'd be better off choosing MP3 playback at a higher quality.
**MP3 playback is now alm
ost ubiquitous and offers excellent compression. It places a slightly higher strain on the processor, but even our minimally specified system should be fine. We use a setting of 64Kb/sec in mono for narration or in stereo for music. Unless, that is, we're developing in Mediator 7, which has an odd dislike for certain settings on particular PCs. MP3s that play back perfectly through Mediator on most PCs seem to play at a much faster rate on other PCs while playing in Media Player without pr
oblems. For some reason, this affects certain bit rates and not others: 96Kb/sec seems to work perfectly on all systems, whereas 56Kb/sec and 64Kb/sec don't. This is presumably because MatchWare rewrote the sound playback engine from scratch for the latest version of Mediator and hasn't quite ironed out the bugs yet.
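**To put rough numbers on that trade-off: uncompressed 44kHz, 16-bit mono audio works out at about 86KB/sec, or a little over 5MB per minute of narration (double that for stereo), whereas 64Kb/sec MP3 is only 8KB/sec, or roughly 470KB per minute - around a tenth of the size.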
**You'll also have to decide on screen resolution and colour depth. There are a number of good reasons for sticking with a resolution of 800 x 600. First, this resolution
is supported by all graphics cards and monitors built in recent years. Having said that, the vast majority of systems will also support higher resolutions, so why not go to 1,024 x 768 at 16-bit colour depth? Because increasing the number of pixels on every screen from 480,000 to 786,432 means you must also increase the size of all the images in the presentation and, crucially, the video clips. This can have an impact both on performance and storage requirements.
**One obvious solution is to deve
lop at 800 x 600 and have the presentation change the user's resolution and colour depth automatically, restoring their original settings at the end. However, this is a bad idea. Not only do you run the risk of hanging a machine when either setting or resetting the display, but it really narks users to have their screens resize under your control, especially if the presentation no longer quite fits on the screen. A better option is to present your program at a fixed size within a black sur
round - the user can then change the resolution themselves if they choose.
**Colour depth is simpler, since just about every PC in use today can operate in 16-bit colour. I develop at 32-bit and then test on a 16-bit display. The differences tend to be particularly noticeable in graphics that contain many colours within a small range - for example, black and white photos and gradients.
**Any other dependencies will vary according to your authoring tool. Director, Opus and Mediator can al
l output to a single EXE file containing the media, code and run-time components, so deploying projects created using these should be a piece of cake, right? In the real world, there are several reasons why packing your presentation into an EXE is not a good idea, particularly for large, complex projects. One of the key requirements of a major project we're currently working on at NlightN is that it should be easy to update. By binding all the resources and programming into a single EXE, i
t becomes impossible to update without replacing the entire EXE.
**In the case of Director, it's standard practice to create a stub projector that contains all the required Xtras but otherwise does nothing more than start up a protected Director movie. The resources and code in that movie are further subdivided across various protected casts. If a bug arises, it's often just a case of replacing the external cast containing your scripts. Similarly, if a graphic has become out of date, it's
just the cast containing your graphics that needs to be replaced.
**This is a little more difficult in Mediator, but ultimately more flexible. By default, Mediator creates a stub EXE and consolidates all the resources into the same directory as the Mediator presentation file. Let's imagine, however, that you deploy via CD-ROM but then want to change a graphic. Clearly, that graphic will need to be stored on the user's hard disk along with an updated Mediator document that points to it. M
ost of the graphics are still on the CD-ROM, so the original Mediator document would have to be amended so that all image objects point to the CD-ROM except the one that has been amended - quite a job for a single amendment. Our solution for this has been to replace all literal paths in the Mediator document with variables. By creating a cdromDrive variable on installation, and sticking to a standard directory structure, it's possible to account for any likely drive configuration on the en
d user's PC. In this way, if the Mediator document is updated, the only amendment needed is to ensure that the variable pointing to the old graphic is replaced with a literal path to the new one on the hard disk. The other resources will still be pointed to by the cdromDrive variable.
**In fact, we're using both Director and Mediator for our project. My intention had always been to create a complete Mediator installation, including run-time components, in various subdirectories on the CD-
ROM and point to them using a Mediator menu. Unfortunately, Mediator's 'Open Md8' action doesn't support variables, so unless I'm prepared to create IF structures to account for every possible CD-ROM drive letter from D to Z it's completely useless.
**So I turned to Director to create the menu. This is a hugely complex job, to which Director is well suited. It has to manage user accounts on the local machine, communicate with an online database to check whether they've been assigned to t
he e-learning package, retrieve their course-completion data and reflect that on the menu, check for and handle course updates, monitor progress, launch the appropriate Mediator document and post their updated course-completion information to the server.
**In this case, I've created my stub projector, plus a main module called menu.dir (dxr on deployment), a cast containing all text components and a separate cast containing the scripts. This way, if the menu needs to be updated, the file
size is kept to a minimum. Configuration information is kept in three INI files and I've made extensive use of the excellent Buddy API Xtra, probably the most useful of all the extensions available for Director. This menu system has been created so that it can be altered by our designers without needing to recompile each time. It generates the menu list dynamically and can even prompt the user to insert the appropriate CD if the project runs over more than one.
**Both the menu system and
the Mediator movies use various Windows ActiveX controls, so it's also essential that you ensure the appropriate controls are installed on the end-user's PC. Doing this in a seamless and professional way is beyond the scope either of the installation program built into Mediator or any custom program you could create in Director. Not surprisingly, the only way to ensure you have a professional setup program is to buy a professional setup program creator.
**My favourite, in terms of value
for money, features and usability, is Setup Factory by Indigo Rose Software (www.setupfactory.com). For our current project, Setup Factory carries out all the usual functions: asking the user for an installation directory, presenting licensing information and creating a shortcut. It also creates a configuration INI file, inserts the CD-ROM's drive letter and installs and registers the ActiveX controls. It finishes by creating an un
install file and shortcut.
**The impact of using a professional setup program like this can't be overestimated: it's your user's first impression of your presentation (and, if it's done badly, also possibly their final impression).
**Indigo Rose has also created TrueUpdate, which, as its name suggests, allows you to check for online updates, download the software and install it. In my case, the menu system will run TrueUpdate as part of the user logon process, check for updates on our w
eb server and invisibly transfer the appropriate media or program files to the user's hard disk. I haven't yet integrated this into the menu, but I'll let you know how I get on with it.
**A free lunch
*I've been using Microsoft Word since version 2, but it seems to have become ever more buggy as time has passed. A friend of mine had written a 350-page multimedia script containing a two-column table on each page. She had placed lots of frames and graphics into the right-hand column, but, a
fter nearly completing the script, Word froze upon opening the document.
**After trying out a couple of downloadable document-fixing utilities with poor results, I tried the old trick of attempting to remove the advanced formatting that was probably causing the problem by saving in a previous format before it crashed. The document remained stubbornly unopenable.
**In desperation, I downloaded and installed OpenOffice.org, following PC Pro's excellent 'Can you ditch Microsoft?' report o
n office packages. I opened the broken document in the Writer application and the results were fantastic. Not only did the document open, but it preserved almost all the advanced formatting and, once saved back into .doc format, it opened perfectly in Word too. The only major problem was that, for some reason, all arrows and lines shifted to the left and needed to be readjusted. But it was a small price to pay.
**Once it was installed on my system, I decided to take a closer look at OpenO
ffice. It's based on the code used to create Sun's StarOffice suite, which, ironically, made a much poorer job of opening the Word document. Sun donated this code and it's now open source. The word-processing application is fully featured and feels more responsive than Microsoft Word.
**The same goes for the spreadsheet application. However, of most interest for this column is the presentation package Impress, which offers all the facilities you're likely to need for most presentations.
It includes a good range of downloadable templates, effects and drawing objects and an especially good 3D object. There really is very little PowerPoint can do that Impress can't. The transition was simple - it took me just a few minutes to get to grips with it and my only gripe is that there appears to be no equivalent to Microsoft's admittedly clunky Pack-and-Go facility for distributing presentations.
**Impress, and the other applications in this suite, are genuine alternatives to Microsoft Office, so much so that I'm trialling the suite here at NlightN with the intention of using it as a replacement for Office for most of the people here. I've written this column in OpenOffice.org Writer and it has been a joy.
Careful deployment and installation are key to a successful multimedia project, says Kevin Partner
August 2003
106 (August 2003)
MULTIMEDIA:VIDEO
examples, shell scripts and a trove of useful public domain and free software, and what you have is a must-have book that all Unix users should keep next to their desk (from whence it will inevitably be tea-leafed).
The second edition of Exploring Java brings this rather good tutorial bang up to date with Java 1.1; coverage has also been extended to RMI, JavaBeans, Inner Classes, and all without disrupting the well thought-out flow. It's a solid revamp and, with Java in a Nutshell, it's prime material for my Java quick-start book list.
Voyager sets off
I'd like to tell you about some exceptionally cool technology. Regular visitors to this little nook in PC Pro might remember me recommending the JGL (Java Generic Library) from ObjectSpace. JGL furnishes developers with all those handy-dandy data structures such as collections and algorithms that make life so much easier when doing real apps in Java. Well, ObjectSpace has been busy beavering away on its next work, Voyager, a
@theweb.co.uk email address and I'll always try to reply. Actually, make that two pleas: if you don't get a reply, I'd appreciate it if you didn't send me nasty mail as it may be that your mail has simply failed to get through.
Book stop
This month I'd like to mention two rather treasured books that I've just had stolen, Unix Power Tools and Exploring Java. I'm not about to go all Nick Ross on you, as I know these O'Reilly books are rather desirable and as such it's perhaps inevitable they'll be 'borrowed'. If you've been unfortunate enough to have either of these books lifted, second editions have just been published, giving you the perfect excuse to go out and buy new copies.
The new 'Unix Power Tools' is pretty much the same as the original in that it's the essential grab-bag of Unix hints and tricks. Flip it open at any page and learn how to control your command line shell, or how to use built-in Unix commands for clever file manipulation. Add into the mix a CD-ROM of
Davey Winder
Semantic traps
There are times when, if I had any hair, I'd be pulling it out. Let me point out the obvious here once more: never click on a link in an HTML email unless you're 100 per cent sure of its provenance. I'm not against HTML mail per se - well actually, perhaps I am, which is why I use a grown-up mail client that doesn't automatically fetch whatever dangerous crap some total stranger has attached to their bloated, unsolicited communication.
**My client of choice, The Bat! (www.
r the software updater, the anti-virus utility, the email notifier, the system monitor and someone's remote-control application helping you to find the right button to press have all taken their bite.
**To some extent, I approve of remote control and use it a fair bit, even though I find the process of remote controlling a machine can often interfere with what else it may be doing in ways you may not foresee. I still haven't quite got over the shock of finding a Lotus Notes server I was u
sing crawl to an almost dead stop and then speed up the instant I removed VNC.
**There are lots of installations out there with tens of thousands of copies of VNC running - because it's free and it isn't platform specific - and I won't be surprised if my mailbox now fills up with sage advice split into two camps; namely, 'Don't use Notes, then!' versus 'Something else broke it!' However, VNC is a very lightweight application in comparison with its distant opposite in the remote-control a
rena; that is, turning on Terminal Services under Windows 2000 Server in Application Mode. That sets in motion several tens of megabytes of stuff, directly descended from the fully customised and recompiled version of NT that is Citrix MetaFrame. Even a brief perusal of what's required to make various applications run under MetaFrame is enough to make me cautious about turning on Terminal Services on servers that have other serious jobs to do. I remain slightly cautious about remote contro
l for a couple of widely separated reasons:
**1. The market is somewhat divided: there are utilities that just do a remote inventory, and then there are the ones that are all about remote control. Remote inventory types like having the option to take over a machine and find out what that wicked little utility might be which someone has downloaded without authorisation.
**Remote-control applications, on the other hand, stretch all the way back to those modes in pcAnywhere and GoToMyPC, w
hich allowed you to connect to your single-user PC from 'anywhere on the Internet'. It's quite hard to ensure that you're using the right utility suite for your mix of purposes.
**2. The usual justification is that you need remote control for all your users so you can support them properly. This noise becomes ever more strident as one moves up the scale of networks, until you discover that you're dealing with UniCentre or HP TopTools or Tivoli or whatever. I still bear mental scars from a
n encounter with the install CD for an HP-managed switch that installed six separate background processes on an already heavily laden server, just so it could watch the state of the ports on the switch. Here in the small network market, we live in the shadow of the hundred-thousand-seat network and can lose track of what's different between our environments and theirs.
**But 'support' means what, exactly, in the smaller network? In those sites where large-scale remote control and/or audit
ing products tend to get rolled out, the number of support staff to user population can be in the one-to-a-thousand bracket. I've heard of one email system with 20,000 users that has this year halved the nerd count from six to three, for example.
**This is a radically different scenario from what we typically see around us. In my marketplace, the support ratio has a wide scatter but averages out so there's normally something like one supporter per 30 users, with one per 100 only in extraordinary circumstances. I can't think of a way in which a remote-control or audit product would accommodate the varied needs of the groups I work with, never mind spanning from them to those who have 10,000 or 100,000 people to worry about.
**Larger networks have different, not merely bigger, populations than smaller ones. It's important if you have 100,000 people all processing one of 20 possible permutations of the same billing and enquiry process that the software they use to do this is
as tightly bolted down as possible. And when something goes wrong, it's vital that a bright light should start flashing on your 20ft wall-screen and a platoon of heavies thunder between the cubicles to track down the culprit.
**It isn't so important that we run our little networks this way. My view of support in the smaller LAN is that you shouldn't be applying most of your effort to stopping people from doing things - it's hard enough getting a clear idea about what they do want to do
without having to battle through security borrowed from the Hard-Nosed Businessman school of corporate paranoia.
**Here's a simple test to help you tell whether you should be thinking about massive control or massive assistance. If you're on the high side of the user:supporter ratio, put effort into remote auditing and desktop lockdowns, unless there's a strong theme of individuality in your user base (say, lots of creatives or mobile users with industry-specific peripherals, like survey
ors or doctors). If you're more relaxed (that is, below that magic 1:30 ratio) and your duties are more about training than 'supporting', you need remote control much more than remote auditing.
**You'll laugh with Hyena
*My interest in remote working is much more focused on what happens on servers than it is on the users' PCs. I naively expect most of my clients' workers to have real work to do that doesn't involve breaking the tools they use for that work. Most of my interest is in contr
olling servers, and there's almost no need to indulge in full-screen remote control of a modern server.
**There are two methods of remote control to which I want to draw your attention. One is a product that has weathered both the test of time and Microsoft's attempts to kill it off by the art of pale imitation. I'm talking about Hyena.
**I first drew people's attention to this product many years ago, but even back in the early days of NT 4, interest in the idea was so immediate that I
got a mail from the author asking if all the traffic to his website was on account of the PC Pro coverage. Even then, Hyena would allow you to stop and start a range of services on your servers without ever walking up to their consoles or installing any extra software on them. But it went much further than that, offering the ability to stop a range of services across a farm of servers, or to apply the same change to a whole pile of user records without repetitive mouse-clickery.
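**To give a rough flavour of the underlying idea - this is Microsoft's own sc.exe command-line tool rather than Hyena itself (you may need the Resource Kit to get it on older versions of Windows), and the server name here is purely hypothetical - you can query, stop and start a service on a remote box from a command window:

    sc \\server1 query spooler
    sc \\server1 stop spooler
    sc \\server1 start spooler

Hyena's trick is to wrap that sort of remote service control, plus user administration, in a single interface so the same change can be applied across a whole farm at once.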
**That's
a solid advantage, and even though Microsoft had a go at duplicating the capability inside Microsoft Management Console, there's still plenty of room in the marketplace for Hyena to carve a more-than-respectable niche. With its help, we can almost claw our way back to where we'd got to with NWAdmin under NetWare 5 in the mid-1990s.
**The other interesting little gadget for remote server fiddling is a bit more esoteric, and it's more about saving mouse-clicks and hiding the true extent of
your attention span than it is about remote control as such. Go sit at your Windows 2000 PC - server or workstation - and open a command window, then type 'netsh interface dump'. Observe the speed with which about 100 lines of guff goes honking past your eyeballs, then type 'netsh interface dump > mynetwork.txt' and open the file in Notepad so you can see what the hell is going on.
**Netsh is a complete language for laying out everything about a whole computer's network presence. It most
ly concerns itself with your local computer, or you can run it remotely, enquiring about the way a distant machine is set up. It works within a wide range of contexts - there are seven, each concerned with the services and layers that make up your machine's contact with the LAN. I know what you're thinking - how can there possibly be seven contexts? Surely, there's a network control panel, and that's it? Looking at the sample output from my workstation, you'd be forgiven for thinking there
are nil contexts - all it says is that I've configured the machine to use DHCP to get all its configuration data. From the perspective of netsh, this machine has nothing interesting to say. If I want to know about it, I could just read the configuration out of the DHCP server.
**Where netsh comes in seriously handy, though, is when playing with the networking on a server, and not just any simple everyday server, but one with what I generally consider to be the 'wrong' kind of work to do.
Just because it's possible for a machine running a server OS to route network traffic from one Ethernet card to another, doesn't mean that the work of a router in your network should be done by the same box that's holding all your files. And in the small LAN sector, when these two quite differently prioritised activities are found together, it's normally a sign of some unfortunate purchasing or design decisions in the recent past. Servers serve and routers route, geddit?
**Except, of cou
rse, that the real world isn't like that. In an email exchange over the last couple of weeks with one beleaguered reader, I got to hear about his stack of Dell servers in a co-location centre, which weren't providing quite the functionality he was expecting. In particular, his capacity calculations had led him to believe that he needed teamed Gigabit Ethernet - that is, more than one Gigabit Ethernet card per server - to support the vast population of users he was expecting, but had found
that the DHCP server process appeared to be ignoring the second Ethernet adaptor and only handing out addresses on the first card.
**This is the kind of territory where the friendly face of the Network Properties control panel starts to bite you hard. Just try having two network properties windows open at the same time when you have two LAN cards in the server - under Windows 2000 you get a prim little message saying, 'you can see the other window once you've closed the first one'. That p
roperty window is far more than just a display of your IP ranges and domains; it's a hook into the Services Control system and is ready to stop and restart anything that likes to talk down the LAN ports you're about to mess with.
**Like my frustrated co-location systems engineer correspondent, I hate tangling with those property dialogs. You're expected to track a consistent set of parameters across a wide range of interfaces, all through several little dialogs, some of which are multiple
choice, others twisty expand/collapse views.
**As soon as you try to do something genuinely difficult - which includes most of the configurations of servers with presences in a DMZ - the graphical user interface simply runs out of road. This is where netsh can take over. It shows you huge amounts of detail concerning the environment of your server, and rather like iSQL it presents it in a re-parseable format.
**Let's suppose that for some reason (close to mental illness) you have thr
ee Ethernet cards in a server and that none of them are teamed or pooled. You want to make an experimental change to the overall configuration, to stop the traffic from Ethernet A from reaching B but enable it to go to C. The first move in netsh terms is to dump out the current config in the 'interface dump' format. That way, you have a single, human-readable text file of your config and if you make a complete hash of it you can always play that all back in again. Or, if you're being sneak
y, you can try editing the dump file to achieve the configuration you want and then play that back in.
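**In practice the round trip looks something like this (the file name is arbitrary, and you should sanity-check the edited file before replaying it):

    netsh interface dump > mynetwork.txt
    rem ... edit mynetwork.txt to reflect the configuration you want ...
    netsh -f mynetwork.txt

The -f switch simply executes a netsh script file, so anything netsh can dump out it can also play back in.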
**Shurely shome mishtake
*Notice, as you stagger through the maze that opens around you once inside the netsh command line (it truly qualifies as its own little world), that the information you need to understand what netsh is actually for comes from some oddities in the error reporting. While scouting around for the source of a problem with static routes in a network with several subne
ts, I happened to type 'netsh routing ip show persistentroutes' and the error message this precipitates is shown in the screenshot above of my machine - which, like the server I was working on, is indeed not installed with RRAS (Routing and Remote Access Services).
**Netsh doesn't concern itself with what workstations have been configured for as humble members of a LAN; it's all about taking control of the central services of all your servers, about having text-file-captured printouts of
all the static WINS mappings within your entire domain, showing how many packets your paired Ethernet cards are pushing through your email border router between the DMZ and the internal network.
**As such, netsh is the antithesis of Hyena - a text-only, batch mode, offline, syntax-driven utility for defining how your server or servers should work together and for finding out how they're getting on with it. You can read all about this powerful utility in (most likely) more detail than yo
u ever thought necessary at <A HREF="http://www.microsoft.com/technet/treeview/default.asp?url=/technet/prodtechnol/winxppro/proddocs/netsh.asp" target="_BLANK">http://www.microsoft.com/technet/treeview/default.asp?url=/technet/prodtechnol/winxppro/proddocs/netsh.asp</A>. You may judge for yourself how deeply technical this tool is, if like me you believe that size matters and the longer the URL, the more useful its content is likely to be.
**If you're now sitting thinking, 'Okay, great
for huge networks, but useless for me sitting here at home', please join me in a small experiment. There are some neat client-side diagnostics you can get out of netsh that should show you how much of the background infrastructure of your network (according to Microsoft's way of seeing such things) may or may not be present.
**Open a command window and type 'netsh' <return> and then immediately 'diag' <return>. You're now in the Diagnostic Context. Type 'ping' and you should see a
list, not of host addresses you can ping but of classes of host addressed services you can enquire about. 'Ping DNS', for instance, will look at all the DNS servers in your network setup and try to ping all of them, just from that one command. Remember, you can have a whole gaggle of DNS servers (and, if you've been reading Jon Honeyball's comments on this matter, you should have been adding DNS servers to your Windows 2000 Server network like mad).
**Try this from the command line - 'n
etsh diag ping DNS'. This provides a neat answer to how your machine is configured as far as DNS is concerned, and whether it can see what it has been told to use. Extend that a bit further to 'netsh diag ping ieproxy' and it will tell you whether a proxy is a) defined and b) reachable from your computer. 'netsh -r 'hiscomputer' diag ping ieproxy' will tell you whether 'hiscomputer' can see the proxy server without you ever getting up from your desk.
**This offers quite a different level
of diagnostic capability from using a GUI-based utility. To check whether a proxy is set up and working, you need to be in the Advanced tab of Internet Explorer, and you need a command window open to then ping the proxy address and see if it responds (before getting into the more complicated world of proxies, which might swallow or reject ping packets).
**I haven't yet had the chance to fully experiment with the possibilities for this command within a network where people aren't allowed
to run command lines themselves, or to change settings in Internet Explorer due to Group Policy restrictions. However, I'll be honest and admit that this is the first time I've been genuinely intrigued by a command-line program since before the advent of the IBM-compatible PC.
Steve Cassidy remote controls his networks using two very different applications
August 2003
106 (August 2003)
Networks
who think I really need to know about erector spray, clearing my credit records or offshore banking (as if you'd bank with someone who'd just spammed you!).
The limited scalability and security of SMTP has prompted the creation of newer protocols that go way beyond just shoving mail around the Net, such as secure exchange protocols that just look at bulk mail transfer and synchronised mail databases. However, all this is still to come and for now you'll be living with SMTP, POP3 and IMAP4, with sendmail as the engine that passes it all around.
Begging letters
After digging through my mailbox this month, I feel obliged to make a plea: if I've reviewed something I've either disliked or found fault with, please don't imagine that I'm going to give it away. All the Unix systems I've covered remain on my hard disk for reference - and no amount of pitiful pleading mail is going to give them a ticket out of Evnull Towers. All other mail is welcomed though: send it to the evnull
ecords set to one particular machine that acts as a mail gateway and forwards the mail on to the appropriate user.
Another example would be where sites agree to co-operate on mail handling and introduce a particular address to handle all incoming mail and prevent unauthorised connections to their SMTP port. This is a typical anti-spam measure for blocking the scum of the Internet from hijacking an SMTP port and using it to post drivel and scams to all the world and its postbox. What happe
ns is that networks are configured to accept only SMTP connections from specified hosts, rejecting anyone who attempts to mail into the system directly.
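As a sketch of how that looks in DNS - the host names and address below are invented for illustration, and the syntax is standard BIND zone-file form - the whole domain's mail gets pointed at one hardened gateway:

    ; all mail for example.com goes via the gateway
    example.com.    IN  MX  10  mailgate.example.com.
    mailgate        IN  A   192.0.2.25

Only mailgate then needs to accept SMTP connections from the outside world; the hosts behind it can refuse port 25 to everybody else.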
The MX records will point to where mail should be sent to get to that system, which will typically be a spammer-aware SMTP handler that blocks incoming flotsam. But the problem with SMTP is that it was designed in a gentler era when there was trust for others on the Net, so unfortunately it doesn't make allowances for the hijacking morons
e sometimes wondered how Unix systems know where to send mail that isn't destined for a local user. In a simple system, this information can be hardwired so, for example, in one office I work from, all the mail - be it from sendmail or mail clients - is sent to the central mail server. The central server contains all the local mail knowledge and decides whether a message is held locally or forwarded on; there's no name service used (for various good reasons). Once you venture out into that vague cloud of connectivity called the Internet, the situation becomes more complex - and this is where DNS (and its MX records) come into the picture.
If an MTA doesn't know what to do with a mail, it can ask DNS where it should send it. DNS will then point it at the MX record for the target host, which will say, 'If you want to send mail here, send it to this IP address', even if that isn't necessarily the same machine. For example, all hosts in an intranet could have their MX r
problem, where a MIME mail message may contain megabytes of multimedia content. By using IMAP4 though, you can look at the mail on the server and only pull it down to your system when you want to read it; it still stays stored on the server.
IMAP4 is more ambitious than POP3, as you can configure it to handle newsgroups and private mail discussions in a far more elegant fashion than mailing lists and Usenet/NNTP-style distributions. This is why a lot of intranet software developers are considering it as an alternative approach to distributed groupware. IMAP has its own website (www.imap.org), which is run by the University of Washington, where IMAP was developed. This site contains documents about IMAP4, comparisons with POP3, definitions and specifications, and an index of products that support IMAP4.
The weakest link
One fly in the ointment is that IMAP4 still relies on SMTP for mail delivery, and SMTP is still the weakest link in the chain. Let me explain...
You may hav
ed a need for remote users to connect to a system and collect their mail without being given shell access, and without having to run a mail program on the server. SMTP is unsuited to such a purpose though, because it can only push mail to another host, and despite the efforts of many clever people, you can't build a reliable SMTP delivery system for dial-up users.
A modern Net application such as Netscape Communicator is set up to retrieve mail from a POP3 server by sending the user's POP3 username and password; it then grabs back all the mail being held for that user. However, it does use SMTP to send mail, because POP3 only supports mail receiving. You'd be forgiven for thinking that this sounds like a dreadful hack-up, but it does work.
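If you've never watched it happen, a POP3 collection is a disarmingly simple text conversation on port 110 - something along these lines (the mailbox name and password are invented, and real servers word their responses differently):

    +OK POP3 server ready
    USER sheridan
    +OK
    PASS secret
    +OK mailbox open, 2 messages
    LIST
    +OK 2 messages
    RETR 1
    +OK message follows
    ...
    QUIT
    +OK goodbye

Every response begins with +OK or -ERR, which is why even the most basic mail client can manage the exchange.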
Even so, there are alternatives. IMAP4 is a more powerful way of managing mail remotely, allowing users to manipulate their mail on a remote server without having to pull it down onto their local system. This gets round the 'massive mail'
ritlabs.com" target="_BLANK">www.ritlabs.com</A>), will display HTML content if you ask it to, but won't connect to any third-party servers to download images or whatever. You are, however, still at the mercy of an itchy clicking finger, because links remain live so the average user is still vulnerable to 'semantic attack', a phrase coined a few years ago that sadly seems to have fallen out of use. Sadly, because if more people understood how such an attack is framed and the damage it can
do, a little more thought might be given to the way email is handled security-wise.
**A semantic attack relies on our instinctive desire to attach meaning to any content placed before us. Human nature inclines us toward believing what we are told, especially if we think it comes from someone in authority. This is the basis for spam hoaxes - these can be a nuisance, but are only damaging when coupled with instructions to delete system files. However, the Internet's 'something for nothing'
culture can be exploited to make a truly dangerous trap.
**So it was that I found myself at the office of a prospective small business client, which proved that a little bit of knowledge is a very dangerous thing. The firm had been having trouble with web browsing - highly problematic, given that its business was largely research led. Its default home page kept changing to that of a rather suspicious-looking site offering software for sale at prices way too low to be genuine, and its pr
oprietary search system had been hijacked by another of a far less salubrious nature.
**It was obvious to me that the system integrity had been breached. When I asked if there had been any other problems such as an unexpected increase in spam, subscriptions to mailing lists that hadn't been made by the small staff (three people), or slowing down of the system for no apparent reason, I wasn't too surprised to hear they had indeed suffered all these. A quick look into the setup revealed th
at the only firewall was the XP software default, Internet Connection Firewall (ICF). The boss, a computing enthusiast, had been assured by friends and colleagues that was all he needed to protect his network from the dangers of the outside world.
**'After all,' he explained, 'it's not as if my little company is of any interest to hackers, is it?' What he didn't appreciate is that his little company is of great interest. Any network with an Internet connection and little protection will
attract hackers like flies to the smelly stuff, but there's also the small matter of company accounts being stored on the network, potentially enabling business identity theft, not to mention their computing resources that could be pressed into illegal activity by a Denial-of-Service hacker.
**The dangers are plenty, the solution simple, yet even this self-confessed enthusiast hadn't merely missed the boat but had jumped into the water with rocks in his pockets. My only surprise was that
it had taken so long for him to suffer the inevitable consequences of such a lax approach to Internet security. It turned out he had only recently installed a new XP Pro-based system, the guv'nor taking it upon himself to specify, source and install it all himself - 'Why waste money getting somebody like you in to connect a few cables and configure a few bits of software?' Ahem.
**Not that XP ICF is a total crock; it just doesn't go far enough to be adequate protection on its own, even f
or a single computer connected to the Internet. The reason is simple: it doesn't offer any protection against rogue, outbound traffic. While it can keep intruders at bay, if you've not smelled the RAT (Remote Access Trojan) that has just been installed on your system and is sending data out to anyone with the appropriate control client, then neither will ICF.
**I hate to say this - because it's always good to see Microsoft taking steps towards making its OS and apps more secure - but my
honest opinion is that ICF needs either upgrading to do a lot more, or doing away with. Yep, I'd rather see no protection at all than poorly implemented protection that lulls users into a false sense of security. When users think they're safe, they adopt a more reckless approach to online activity; however, when they know they're vulnerable, they take a great deal more care over file attachments, HTML links and suspiciously generous download opportunities. It all goes back to that core hum
an-nature argument that enables semantic attacks to work so well.
**My newly engaged client had, in just over two weeks of system usage, been totally compromised by a combination of poor physical security and a lack of any acceptable use policy or even user guidelines. 'They're sensible people', I was informed, 'They don't need to be patronised or hand-held'. Yet, they'd merrily been clicking on HTML links leading to potentially useful research sites, so they thought, but all the while d
ownloading VBS scripts that were hijacking the Microsoft IE home page by simply changing the HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\Start Page entry in the system Registry. The reason they couldn't permanently change it back, despite the boss trying every day, was that this script lived in their Startup folder to ensure pervasiveness.
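**If you ever need to undo that particular trick by hand, the value can be put back from a command window with Microsoft's reg.exe - the choice of about:blank here is mine, so substitute whatever home page you actually want - after which you must also clear the rogue script out of the Startup folder, or it will simply reassert itself at the next logon:

    reg add "HKCU\Software\Microsoft\Internet Explorer\Main" /v "Start Page" /t REG_SZ /d "about:blank" /f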
**A little bit of training for the boss and his staff could have prevented the initial downloading, as well as the subsequent ope
ning, of the VBS attachment (or possibly ActiveX script). A software firewall such as the PC Pro A-Listed ZoneAlarm Pro is capable of blocking such attachments, isolating them using a quarantine function that makes it clear that opening the attachment could be dangerous. ZoneAlarm (<A HREF="http://www.zonelabs.com" target="_BLANK">www.zonelabs.com</A>) or Outpost (<A HREF="http://www.agnitum.com" target="_BLANK">www.agnitum.com</A>) are now very stable and mature products, in my experience
, and certainly preferable to the XP ICF offering. Not only would they both have prevented the home page hijacking problem, they'd also have prevented the RAT infestation, which I revealed by doing a quick online system scan using the free Panda ActiveScan service (<A HREF="http://www.pandasoftware.com/activescan/com" target="_BLANK">www.pandasoftware.com/activescan/com</A>).
**Their system was equally lacking on the anti-virus front, although once again Mr Littlebitofknowledge assumed e
verything was okay because there was an anti-virus scanner installed. Unfortunately, this was a freebie from a cover CD that had never been registered or upgraded to the pro version for commercial use and consequently never updated, so less useful than a chocolate teapot. Regular readers will know I'm an advocate of Sophos Anti-Virus (<A HREF="http://www.sophos.com" target="_BLANK">www.sophos.com</A>) because, while it isn't the cheapest solution, you get what you pay for - and, in this ca
se, you're paying for support and security. Virus identity files are released as soon as a new threat is established, with an email notification sent and installation of the IDE files required to update the scanner just a click away.
**Just as important, if you ever get virus-infected, there's a human on the end of a telephone line 24 hours a day, and they'll hand-hold you through the process of disinfection. That's something my new client would have found invaluable, and would have save
d a fair amount of money compared with my consultancy fee for sorting all this out at such short notice. By the time I'd finished, the company was equipped with a decent, low-cost, but perfectly serviceable router to provide NAT protection, backed up by a software firewall to deal with any Trojan/malware activity from within, plus anti-virus software fully registered (and the importance of regular updating drummed in). It took little more than 15 minutes to devise an acceptable-use policy
covering system integrity matters as well as the more obvious (worryingly absent previously) formalities over what uses of company resources are and aren't allowed under contract-of-employment terms. I threw in a 15-minute pep talk and training session for free - it was only common sense after all.
**The copycat URL
*One of the problems revealed by this anecdote is that fake URLs are often used in semantic attacks. Spammers and hackers are neither stupid nor technologically inept, and th
ey've known how to spoof URLs for years. The sad thing is that they're still doing it and punters are still falling for it - on the whole, people have absolutely no idea how a URL does what it does and are happy that it just takes them to some website.
**I'm not suggesting that everyone needs to read RFC1738 (<A HREF="http://www.faqs.org/rfcs" target="_BLANK">www.faqs.org/rfcs</A>), although PC Pro readers might benefit from understanding the specification that defines the URL scheme. Th
e first thing to understand is that an URL is a 'uniform' resource locator, so although most people associate it solely with websites it can address a much wider range of Internet resource types. Read that RFC, and you'll realise that an URL consists of a protocol name, a colon and then a string that gets parsed according to the particular protocol being used. In the case of a web page, the full Common Internet Scheme takes the form //<user>:<password>@<host>:<port>/<url-path> with only the host section being obligatory.
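**Pulled apart, a fully dressed (and entirely made-up) example such as http://fred:secret@www.example.com:8080/reports/index.html breaks down as protocol http, user fred, password secret, host www.example.com, port 8080 and url-path /reports/index.html - and everything except the host can be left out.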
**Not all web servers support URL requests in decimal format, demanding instead fully qualified domain names from the client (in the case of a virtual server setup, for example), but the vast majority do. This leaves a huge security hole just waiting to be exploited by a semantic attack on an unsuspecting user. You can't blame the average user for taking links at face value, especially when they don't understand the full URL that appears
in the browser address bar - these days we're used to seeing all sorts of convoluted and meaningless strings being served up on the fly from some clever back-end database. This only adds fuel to the fire, making it ever easier for those with malicious or criminal intent to exploit our trust and confusion.
**The very fact that URLs are easy to read - and even easier to misconstrue - makes for a potentially serious cultural issue that needs to be addressed before the problem will go away.
This isn't something that can be solved with a Microsoft IE patch (Microsoft itself has fallen victim in the past to people passing URLs off as pointing to the Microsoft Knowledge Base when they didn't), but rather requires an educational fix. If you embed a carefully constructed-to-mislead URL into an HTML email message and make sure the link is further concealed by using the anchor tag to display different text (maybe some URL the reader would expect), a good percentage of busy folks wil
l fail to absorb the real URL that pops up as a mouseover - and even if they did absorb it, it still contains the real URL string to further convince your subconscious of its authenticity.
**So how do you spot such frauds? By understanding how they're constructed. Store that information in your brain and there's a much better chance that you'll not be fooled by the fakers - a mental alarm will go off every time something doesn't look quite as it should. Believe me, I've tried this out on
numerous people and once they understand the fake-URL concept they spot them easily enough. Any URL containing the @ symbol should hoist your red flag and demand further investigation - here's the 'science bit' that explains why.
*There is, even among old hands, still much confusion about URL structuring and usage, which is why spammers, malware distributors and criminals out to actively defraud have been able to exploit the knowledge gap to their advantage. In the case of web spoofing, th
ey exploit the fact that any text placed before an @ symbol within the URL gets treated as login details (the username and password) for the website in question. The clever bit is that if the website referred to after the @ doesn't actually need a password (which the fraudulent site most certainly won't), the text just gets ignored and your browser happily sucks in the site as specified in the rest of the URL. For example, the URL <A HREF="http://www.pcpro.co.uk@3639555432" target="_BLANK">www.pcpro.co.uk@3639555432</A>
actually points to Google rather than the PC Pro website.
**Here's why. A simple lookup at SamSpade (<A HREF="http://www.samspade.org" target="_BLANK">www.samspade.org</A>) reveals that <A HREF="http://www.google.com" target="_BLANK">www.google.com</A> resolves to an IP address of 216.239.53.104, and armed with this information you can pop off to Geektools (<A HREF="http://www.geektools.com" target="_BLANK">www.geektools.com</A>) and use the dotted-quad-to-decimal conversion calculator t
o obtain Google's decimal address as 3639555432. If you're feeling truly nerdish, do these calculations yourself by applying the formula (216*2^24) + (239*2^16) + (53*2^8) + 104, or (216*256^3) + (239*256^2) + (53*256) + 104 if you prefer. How you arrive at the magic number is irrelevant, but once you have it you can use it to point a web browser at the site in question in a way few will recognise.
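**Worked through with Google's dotted quad of 216.239.53.104, the sum runs like this:

    216 x 256^3 = 3,623,878,656
    239 x 256^2 =    15,663,104
     53 x 256   =        13,568
    104         =           104
    -------------------------------
    total       = 3,639,555,432

which is exactly the number lurking after the @ in the example above.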
**Net Guru
*Every now and then, I get surprised by a bit of software sent to try out. Such was
the case when the people at Atomica sent me a copy of their GuruNet (<A HREF="http://www.gurunet.com" target="_BLANK">www.gurunet.com</A>) product. At first, I wasn't that impressed, as it appeared to be little more than a mini-meta search engine wrapped up in a fancy IE-based skin and costing money, whereas the search engines it contained could be queried free of charge.
**Indeed, if all you wanted was the search capabilities of Google (<A HREF="http://www.google.com" target="_BLANK">w
ww.google.com</A>), AltaVista (<A HREF="http://www.av.com" target="_BLANK">www.av.com</A>), Teoma (<A HREF="http://www.teoma.com" target="_BLANK">www.teoma.com</A>) and All the Web (<A HREF="http://www.alltheweb.com" target="_BLANK">www.alltheweb.com</A>), you'd be silly to cough up the requested $35 (£22) for a GuruNet licence. However, it comes into its own when you either type a keyword into the topic bar or use the <Alt+mouse button>-click over a word in a document, a web page or
just about anywhere on your screen and GuruNet will retrieve all the information it can find about that word from numerous sources and present it to you via a tabbed interface that expands depending on what references it has found. You'll get dictionary definitions, thesaurus alternatives, jargon explanations, economic implications, quotes containing the word and encyclopaedia entries. Translations to and from numerous (and configurable) language options, images found online and all the se
arch engine sites are in there too.
**Not only is this approach pretty thorough, it's damned quick too. On a broadband link, your results will be back within seconds, and dial-up users won't wait much longer. I like it a lot and use it a lot. There's a small toolbar search facility that sits hidden at the right-hand bottom screen border for you to click on and enter a search term, and GuruNet is also available from right-click menus, plus the aforementioned Alt+clicks. In other words, it'
s always where I need it, and that makes me happy.
**Some of its thunder might be stolen when Office 2003 hits the streets, because there's a good online reference engine built into the Word Task Pane. Highlight and choose Reference from the right-click menu and the Task Pane retrieves and displays dictionary definitions, thesaurus alternatives and translations. The big question is, do you want everything in one place or are you happy having these online resources scattered around your a
pplications? Personally, I find I'm using both GuruNet and Office 2003 task pane references almost equally. They refer to different sources and, as a writer, it's always nice to have plenty of options to fill my mental gaps.
Protect your network from the outside world, shouts Davey Winder
August 2003
106 (August 2003)
Online
Tom Arah
Advanced acrobatics
When Adobe Acrobat was first launched on 15 June 1993, it was hardly noticed; a decade later, however, and Acrobat 6 is a major event. To appreciate Acrobat's importance, you have to realise that it isn't so much an application as a platform built around a file format: PDF, the Portable Document Format. And to understand what makes PDF so special, you need to understand where it came from - namely, out of Adobe's work on PostScript.
**PostScript is a PDL (page description language) that p
/sendmail, also nicknamed 'sendmail heaven'. It will point you at sources and resources, and there's a natty little CGI page that will let you enter basic sendmail configuration information from which it will produce the sendmail.cf file; this is the setup and control file for sendmail. Finally, there's also the official sendmail page, www.sendmail.org
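For the curious, the usual way to avoid hand-editing sendmail.cf is to write a short m4 macro file and process it into the real thing. Something along these lines - the smart-host name is invented, and the exact include paths vary from system to system - covers a simple site that relays everything through one gateway:

    dnl sendmail.mc - a minimal macro file
    OSTYPE(`linux')dnl
    define(`SMART_HOST', `mailgate.example.com')dnl
    MAILER(`local')dnl
    MAILER(`smtp')dnl

which is then built with something like 'm4 /usr/share/sendmail-cf/m4/cf.m4 sendmail.mc > sendmail.cf'. That's still only a sketch; the point of tools like the CGI page mentioned above is to spare you even this much.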
Sendmail is capable of much more than just supporting SMTP to transfer mail, including UUCP support for the less TCP/IP-enabled systems.
I've even seen sendmail configured to move mail by copying it onto a floppy disk, which was then conveyed by foot to another network, sucked in and delivered. Sendmail guruship is a hard-earned qualification, and I myself am just a mere dabbler in this blackest of Unix arts.
At this point you might be wondering about POP3, the Post Office Protocol. Another acronym associated with Unix mail, POP3 enables remote clients to pick up their mail. As the influence of the Internet spread, it spark
nyms crops up: SMTP (simple mail transfer protocol), which defines a way for MTAs to exchange mail. It assumes that a designated TCP/IP port, 25, will respond with a simple text-based protocol to pass on a mail message, and that this protocol is SMTP.
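To give a flavour of just how simple that text-based protocol is, here's roughly what a hand-typed delivery looks like if you telnet to port 25 - the host and user names follow the babylon5 example used elsewhere in this column, and real servers will phrase their replies differently:

    220 babylon5 SMTP ready
    HELO narn
    250 babylon5 Hello narn
    MAIL FROM:<garibaldi@narn>
    250 OK
    RCPT TO:<sheridan@babylon5>
    250 OK
    DATA
    354 Send message, end with a '.' on a line by itself
    Subject: test

    Hello from the other side.
    .
    250 Message accepted
    QUIT
    221 Closing connection

Everything SMTP does is a variation on that exchange, which is precisely why it's so easy for the unscrupulous to abuse.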
The MTA that sees the most use on Unix is sendmail, a topic on which entire books can and have been written. Sendmail is the cryptic programmer's cryptic program, but if you can master it, it allows you to do numerous clever things with mail routing. The definitive book on sendmail is sendmail, 2nd Edition (ISBN 1-56592-222-0, Pub O'Reilly) by Bryan Costales and Eric Allman (the author of sendmail), which covers everything you could ever need or want to know about sendmail. Spanning 1,050 pages, this tome gives you an idea of the complexity of sendmail and its power - indeed, there are consultants out there who are successful and yet specialise in nothing else.
For on-line sendmail resources, go directly to www.jensen.com
ithout a lot of ancillary work, it isn't really scalable anyway.
Enter the @
It's at this point that the @ sign arrives on the scene. A mail address such as sheridan@babylon5 simply tells you this message is destined for user sheridan on the machine babylon5. Easy enough just saying it, but putting it into practice demands two separate pieces of software: an MTA and an MDA.
An MTA (mail transport agent) knows how to move mail between systems, while an MDA is a mail delivery agent for a local Unix system. (Ironically, the latter is the mail program I mentioned earlier, because all it needs to know is how to deliver mail locally.) If a mail message is destined for a local user, the MTA hands it straight to the MDA, addressing it appropriately.
Alternatively, the MTA will work out the start of a route to the mail's remote destination and send it on to another MTA running on another machine, which will then repeat the process. This is where one of the more familiar mail acro
Paul Ockenden and Mark Newton
Up the Amazon
Some products are launched in a blaze of glory and lights. Microsoft products might even come with a soundtrack by David Bowie or The Rolling Stones. At the other extreme, we have the so-called 'soft' launches. No lights, no music, no drinks. In fact, no party at all.
**One of the quietest product launches recently must surely have been Amazon bringing its Web Services technology over to the .co.uk website. Web Services were launched on the US-based Amazon.com site in July last year as a
rogrammatically describes the contents of any printed page - including all vector graphics, bitmaps and fonts - so running the same code on any PostScript-enabled device will produce the same resolution-independent output. It's this programmatic nature that ensures designers can proof the same page layout on their PostScript laser printer as will be output on their bureau's imagesetter. Ten years ago, PostScript and its standalone EPS (Encapsulated PostScript) file format were already esta
blished as the underlying architecture for commercial printing, but Adobe realised that a PDL could be useful for more than just print.
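**If you've never looked at the language itself, a complete PostScript page program can be as short as this (a purely illustrative scrap of my own, not anything Acrobat generates):

    %!PS
    /Helvetica findfont 24 scalefont setfont
    72 720 moveto
    (Hello from PostScript) show
    showpage

Every printed element on the page is described by little programs of this sort, which is what makes the output device-independent.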
**However, there was one major problem: look inside a PostScript file and all you'll see is reams of code - or, at best, in the case of EPS, a crude bitmapped preview. To fulfil its potential, Adobe needed to make PostScript viewable on screen. In fact, that was always the intention, but Adobe failed to persuade OS developers - with the exception of Stev
e Jobs' ill-fated NeXT - to license its Display PostScript technology, because it was expensive and too processor-hungry. Adobe's stroke of genius was to rework the print-orientated PostScript format into a superset of its functionality, optimised for on-screen viewing and compressed to within an inch of its life. This was PDF, a 'distilled' version of PostScript.
**And that describes how PDF files were created: central to early versions of Acrobat was (and still is) Distiller, a utility
that takes a PostScript print-to-disk file and converts it to PDF. In other words, PDF is created by printing the document into a viewable file. These days, Distiller can be bypassed, but the process is still built upon print-based output.
**This may be an awkward way to go about producing files, but it's the secret of Acrobat's success, since all applications can print and hence can produce PDFs - exact electronic replicas of their printed output. This PDL-based universality is unique
and was important in the early days of Acrobat, when incompatibility between platforms and applications was strangling the development of business computing. Acrobat's initial raison d'être was to act as computing's lingua franca, a universal middleman (as suggested by Acrobat's juggler logo) that enabled office computer users to collaborate whatever their platform and application. These were the heady days of the imagined 'paperless office'.
**As it turned out, the arrival of the Intern
et and the ever-increasing dominance of Microsoft Office and the PC combined to remove most of the problems that Acrobat was designed to solve. The technology might well have fizzled out were it not for the release of the free Acrobat Reader program, which, at a stroke, transformed PDF into a capable publishing format - authors could provide free-to-view, electronic versions of previously printed material at virtually no cost to themselves.
**The PDF-based software manual was born, forcin
g more users to install the Acrobat Reader, which encouraged more PDF publishing and so on...
*PDF became an important publishing medium in its own right, but it never actually reduced the need for traditional paper-based publishing - and there it had another obvious role to play. As a universal single-file format that can contain all images and fonts - and which, crucially, is based on PostScript - it's the natural medium for designers to deliver their commercial print work for output. It
's ideal for the bureaux too, as they only need to support one master application rather than every release of every possible publishing application.
**It's almost impossible to over-estimate the potential of PDF for the designer during the proofing/review process, as a final published electronic product and as the digital master for commercial print. However, there's a big difference between the drawing board and the real world, and the early Acrobat needed to change radically to begin
delivering on this dream. That's exactly what it has done with each release and update to the PDF file format.
**For collaborative working, the breakthrough version was Acrobat 3, which bundled the previously separate Exchange and Distiller programs and provided everything needed to take full advantage of the PDF format, including the Post-it Note-style commenting capabilities we all know. Version 4 added a bunch of new commenting tools and the Annotations panel for easier review, but its
main advance was the introduction of encryption to ensure secure collaboration. Version 5 rethought the workgroup review process by enabling efficient browser-based access to a single, centrally held master.
**Version 3 was also the seminal release for those authoring PDFs as their end product, as it showed how electronic publications could offer more than their paper equivalents. The Link tool made it possible to set up basic navigation within a document, while the Form tool took this i
nteractivity further. The Sound and Movie tools even let you link to WAV audio and AVI video files, giving your work a multimedia edge. Version 4 added little to Acrobat's authoring capabilities, but version 5 introduced content tagging, which was useful for preparing layouts that could be reflowed - essential for handheld viewing on the new Pocket PC Reader. To further enhance Acrobat's eBook role, Adobe also added a secure mechanism through which authors could charge for their publicatio
ns with the dedicated Content Server software.
**Surprisingly, despite its suitability, Adobe hadn't intended PDF to be used for producing commercial print - that was PostScript's job. Hence, the early Acrobat only supported RGB colour - it was only later that the crucial CMYK and 'Device N' support for other colour models such as spot and hi-fi colour were added. Further improvements were Acrobat 4's support for PostScript Level 3 and ICC-based colour profiles and version 5's introductio
n of the shared ACE (Adobe Colour Engine) and transparency handling.
**By the launch of version 5 in April 2001, Acrobat had changed out of all recognition, but no-one would claim that it was an unqualified success. Collaborating via PDFs still boiled down to the digital equivalent of covering the page with sticky notes and passing it on. PDF still offered little more than an electronic replica of printed output, with the huge drawback that no-one enjoys reading on-screen. Plus, for comme
rcial print its lack of colour separations and preflighting meant the user either had to turn to expensive third-party plug-ins or cross their fingers. The underlying technology might be unbeatable, but the bottom line was that trying to make sense of an annotated PDF was a chore, trying to read a PDF manual was a pain and trying to produce commercial print via PDF could still be an expensive nightmare.
**Does 6 deliver?
*Expectations were therefore high for Acrobat 6, but my first impre
ssion was disappointment, because the only hard information provided amounted to a few press releases pushing two cosmetic changes related to Acrobat's office-centred use. First, the application has been split into four versions: Adobe Reader for free viewing; Acrobat Elements for corporate use; Acrobat Professional for high-end designers and Acrobat Standard for everyone else. Second, creating PDFs has been made simpler. Yes, being able to select the new 'Convert to PDF' command from a DOC file
in Windows Explorer is more convenient, but the underlying process is still fundamentally the same - the command simply opens the file in Word, where the PDFMaker macro takes over to distil the document.
**Convenience isn't to be sneered at, however, as the new collaboration features in Acrobat Standard and Professional show. Acrobat is stuck with its Post-it-Note approach to annotation, because the whole point of PDF is that it shows a fixed representation of the page layout, but the ne
w tools, review management and especially the redesigned Comments panel certainly make this process much smoother (see issue 105). I'm still not convinced the office users at which Acrobat Standard is aimed wouldn't be better off using Word's own review capabilities, as these allow suggestions to be automatically incorporated. For designers, though, Acrobat's page-layout based commenting system has always been invaluable for speeding up the proofing process, and Acrobat 6's major overhaul
will be especially welcome when deadlines are tight.
**The main benefits for the designer arrive in Acrobat Professional, aimed at commercial print production, so my first move was to load up Distiller and see what new power was available. I was surprised to find that the highest 'Press Quality' preset still defaults to Acrobat 5/PDF 1.4 compatibility.
**However, by forcing the compatibility setting to Acrobat 6/PDF 1.5, I was able to discover a couple of new features - notably support
for greater object-level compression and JPEG2000 image compression - but while slightly smaller file sizes are welcome, they're hardly revolutionary.
**I thought I might have found the new power I was seeking in two new Acrobat presets enticingly named PDF/X1a and PDF/X3. There's a whole new tab in Distiller's PDF Settings dialog dedicated to handling PDF/X features such as the document's TrimBox, MediaBox, and BleedBox and trapping, but in fact none of these features are new, since both
PDF/X presets output to the Acrobat 4/PDF 1.3 format. In fact, the PDF/X1a and PDF/X3 presets are simply codified standards for CMYK and colour-managed prepress exchange respectively - ideal for handling adverts.
**Distiller 6's PDF/X support demonstrates not Acrobat's new power but its new concern for quality control, which is even more apparent in the main Acrobat Professional application. Previously, Adobe took no interest in the quality of your PDF master: if there was an RGB image m
ixed up with the CMYK images or a low-resolution image, that was your problem and you'd only find out about it when your print-run arrived. Now there's a Preflight command that checks your file in exhaustive detail, listing all potential problems.
**Preflighting is essential, but by far the most important check of all is to output the colour separations yourself for proofing. Bizarrely, this wasn't possible in Acrobat 5 without using expensive third-party plug-ins - fine for the bureau, b
ut beyond the reach of the average user. There were workarounds - such as outputting each page from Acrobat as an EPS and separating those, or even pre-separating your files and sending publications composed of separate cyan, magenta, yellow and black pages - but clearly this went against the whole principle of the streamlined composite PDF workflow. Now, at last, you can output separations to any PostScript device and, even better, preview each plate on-screen.
**Acrobat 6 is a major ste
p forward in terms of collaboration and commercial print, but by exploiting power that was already there rather than through new functionality. It's a sign of Acrobat's increasing maturity, but I was beginning to wonder if there was anything new in the PDF 1.5 format (a feeling reinforced by just a single reference in the Help file).
**Producing PDFs as an end product seemed a similar story: the authoring tools for adding value to the final PDF have been augmented rather than revolutioni
sed. The Forms tool is more accessible and now supports styled text and better JavaScript handling. And, although Adobe doesn't deem it worthy of mention, the much-neglected Movie and Sound tools now support far more media formats, including MP3, MPEG and even SWF, if the user has the necessary players installed. The Tags palette offers more power and can be used to control Acrobat's new screen-reading capability via the new Read Aloud command. In each case, the tools are more powerful, bu
t I can't imagine that many people using them.
**In a way, though, that's the point: retrofitting interactivity and multimedia interactivity to an existing PDF doesn't make sense. The place to do it is in the original authoring package, and that's the direction in which Acrobat is moving and where the new power is hidden. The universal nature of Acrobat's print-to-PDF capability has been the backbone of its success, but it only goes so far. A print stream is inherently concerned only with
the appearance of a document and all other information is lost. But if you create the PDF directly from the application, this limitation is instantly removed.
**The first hint of these new, richer PDFs that you can create directly lies in Acrobat 6's support for layers. At the launch, the only indication of this new capability was the provision of PDFMaker macros to enable AutoCAD to produce layered drawings, but it's safe to say that Adobe has other applications in mind. In particular,
of course, each of its three main publishing applications - Photoshop, Illustrator and InDesign - make heavy use of layers. For InDesign, and to a lesser extent Illustrator, the benefits of being able to hide and display PDF-based layers are fairly obvious, such as the ability to include different language versions in the same file.
**For Photoshop, though, the use of layers is really only important while editing, and it's here I think the new PDF 1.5's richer structuring capabilities ar
e truly aimed. For years, Adobe has talked up the benefits of an all-PDF workflow, and each of its publishing applications can now save directly to PDF. However, users have stuck to the native file formats AI, PSD and IND to maintain maximum editability for their original work and the file standards EPS and TIFF for maximum compatibility when it comes to cross-application exchange.
**Gradually, though, all Adobe's publishing applications have been converging on this shared and editable P
DF format and it's clear Acrobat 6 will be another major step on the way.
**But why stop there? The very fact that Adobe so studiously avoids mentioning Acrobat's new multimedia capabilities piques my interest. I can't imagine many users using the Sound and Movie tools to retrospectively add multimedia to an existing print project, but if the capability is there from the beginning in InDesign it's a different story.
**Forget about Acrobat's new Read Aloud command - the average user won'
t want to listen to Microsoft Sam for long - but how about providing an MP3-based narration alongside the text in a combined Talking/eBook? Or how about adding explanatory Flash animations to your projects? Or what about replacing your publication's screenshots with video walkthroughs? Maybe users will eventually groan when their software manuals are provided on boring old paper rather than PDF.
**And why stop there? The beauty of Acrobat 6's new multimedia capability is that it doesn't
provide the media player, leaving that to the experts - Microsoft, Apple, Macromedia - and to the end user. The PDF simply acts as an intelligent delivery mechanism, providing the media best suited to the user's current setup. But if it's simply a wrapper, what's to stop Premiere wrapping its sound and video files in a PDF? The record industry is looking for a secure way to add chargeable value to MP3s, and wouldn't you be willing to pay a premium for a PDF album that includes lyrics, phot
os, videos and so on?
**Or what about DVD? These days, the DVD specification looks terribly rigid and outdated, and wouldn't it give Adobe's new DVD Encore application a much-needed edge if it could also output projects to a much more modern and powerful media delivery format that automatically adapts itself to the user's playing environment?
**This vision of PDF as the all-singing, all-dancing, all-encompassing creative file format is pure speculation, and I don't suppose that this generat
ion of Acrobat will fully deliver on this expanded dream of universality any more than previous versions did. Having said this, I'm sure Adobe is moving in the right direction. With Acrobat 6, PDF both builds on its secure PostScript-based page description foundations and moves beyond them to become an even more powerful, general-purpose design format. Perhaps even the design format.
Tom Arah investigates the design potential of the latest version of Acrobat
August 2003
106 (August 2003)
Publishing/graphics
) his mailbox. Deleting a message is a matter of the mail-reading program editing the file.
This is true of most Unix mail software: it manipulates a user's mailbox file and forwards outgoing mail to the mail-sending program. I've found that the ELM and Pine mail-reading programs tend to enjoy the most popularity, mainly because they're character-based and allow people to Telnet into my system to read their mail. But if that's Unix mail proper, then what's the rest of the stuff people call 'Unix mail'? Well, mostly it's software for sending mail between users on different Unix systems. As soon as the need arose for remote Unix systems to exchange mail, that simple write-to-file delivery mechanism was no longer adequate. Remember, this was in the dark days before networked file systems were commonplace, so the simple trick of unifying all the mailbox files into one shared directory then letting all the local machines see that share had yet to be realised - not to say that w
tify its status by owning the program file and letting other users read and execute it (but not write to it), and by setting its permissions to include the SUID bit. When the program runs, it can ask to become the user that owns it, so if lorien runs the mail program, providing the mail program is owned by root and has the SUID bit set, it can become root for long enough to write into sheridan's mail file. This privilege should be closely guarded though - a badly-written program that makes overly free use of the SUID facility will probably generate security holes (it's one of the commonest causes).
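To get a feel for what that ownership-plus-SUID arrangement looks like from the outside, here's a minimal Python sketch that simply reports the relevant permission bits on a file. The path at the bottom is purely illustrative - the name and location of the local delivery program vary between Unix flavours.

import os
import stat

def describe(path: str) -> None:
    # Report who owns the file and whether the set-uid bit is present.
    st = os.stat(path)
    print(path)
    print("  owner uid:", st.st_uid)                            # 0 means owned by root
    print("  SUID bit set:", bool(st.st_mode & stat.S_ISUID))
    print("  others may write:", bool(st.st_mode & stat.S_IWOTH))

describe("/usr/libexec/mail.local")   # hypothetical example path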
Back to the mailbox, where there's a mail message delivered to sheridan. Sheridan can read his mail by running a mail-reading program which looks in the mail folder and turns it into that familiar list of headers and senders folk associate with mail readers. Sheridan's mail-reading program doesn't need super-user access because sheridan owns (and is the only person with access to
ating system, now long vanished into history). Then the bods at Bell Labs decided to add multiple users to Unix, and naturally they also wanted a facility to send mail between those users sharing the same system.
Their chosen solution was simplicity itself: when user sheridan wants to send mail to user lorien, sheridan runs the mail program, citing lorien as the recipient, keys in his message, which the mail program then appends to the end of lorien's mailbox. A mailbox is no more than a file owned by its user, readable and writeable only by that individual. These mailboxes typically (but not necessarily) reside in a directory called /var/mail and are named after the user, so lorien's mailbox would be /var/mail/lorien.
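As a toy illustration of that write-to-file delivery, here's a rough Python sketch of appending a message to a recipient's mailbox in traditional mbox style. A real delivery agent would also lock the mailbox and run with the necessary privileges; this version just assumes it's allowed to append to the file, and it borrows sheridan and lorien from the example above.

import time
from email.message import EmailMessage

def deliver(sender: str, recipient: str, body: str) -> None:
    # Build a minimal RFC 822-style message...
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(body)
    # ...then append it to the recipient's mailbox: an mbox "From " separator
    # line, the message itself, and a blank line to finish.
    with open(f"/var/mail/{recipient}", "a") as mailbox:
        mailbox.write(f"From {sender} {time.asctime()}\n")
        mailbox.write(msg.as_string())
        mailbox.write("\n")

deliver("sheridan", "lorien", "The White Star leaves at eight.")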
So how does one user's mail program manage to write to a file owned by someone else? Answer: by using the Unix facility that lets a program take on the attributes of another user, namely the super-user or 'root'. The super-user has to mark the program to iden
Jon Honeyball
Plus! DME
I can't contain my excitement - Microsoft has launched the most important product of the year and it's just turned up at my office on a single CD accompanied by some documentation. Forget Server 2003, Office 2003 and anything else with 2003 in its title. This baby is gonna do the business and will ensure the sort of solid revenue stream and user excitement that every ambitious product manager dreams of at night.
**I am, of course, referring to the Windows XP Plus! Digital Media Edition (D
free facility, allowing developers and website owners to build applications and tools that incorporate Amazon's interfaces and content into their own sites. These services enable your users to search and browse Amazon products and features just as if they were running on the Amazon site itself. The only real restriction is that you must provide 'Buy now' buttons - you can't, for example, show Amazon's reviews of a book without also allowing Amazon the opportunity to sell it to your visito
rs, which is only fair, as it's trying to make a profit.
**This Amazon Web Services magic is achieved by a set of self-contained applications that can be published and invoked across the Web using XML or SOAP-based protocols. Both methods return the same structured data (product name, author, publisher, price, sales ranking) that you'd find on the Amazon.co.uk site, and you can drill down further by supplying parameters such as keyword search terms and browse tree nodes.
**The XML and SOAP
interfaces enable you to grab product information directly from Amazon's servers and format it in any way you want - if you have a fluffy, pink site, you can present Amazon's books and CDs in a fluffy, pink way.
**There are two possible income streams from Web Services. First, the scheme runs in tandem with the existing Amazon Associates programme so you make up to 15 per cent on all sales you generate. Amazon isn't just about books and CDs these days - you can start selling anything from
cameras to sandwich toasters from your own site using Web Services.
*Second, there's an emerging market for people to create systems or services that enable other site owners to plug into Amazon's data. One of the coolest third-party applications we've seen is the 'Yes Bar', which uses the .com rather than the .co.uk Web Services, but what it does is brilliant - it examines the 'now playing' data for thousands of American radio stations and provides a simple one-click opportunity to buy
the CD currently being played. Look at <A HREF="http://techno.starcd.com/proto/yesbar-signup.cgi" target="_BLANK">http://techno.starcd.com/ proto/yesbar-signup.cgi</A> for more details.
**If you want to have a play with Web Services, the first step is to download the SDK, which contains the technical documentation as well as lots of code examples (for both the XML and SOAP interfaces). You'll find it at <A HREF="http://associates.Amazon.com/exec/panama/associates/join/developer/kit.html" target="_BLANK">http://associates.Amazon.com/ exec/panama/associates/join/ developer/kit.html</A>.
*Once you've read and digested the documentation, you'll need to apply for a Developer's Token at <A HREF="https://associates.Amazon.com/exec/panama/associates/join/developer/application.html" target="_BLANK">https://associates.Amazon.com/ exec/panama/associates/join/ developer/application.html</A>, which confirms your agreement to the Amazon Web Services terms and conditions and allows you to
query the Amazon servers.
**As soon as you have the SDK and token, you're ready to rock. If you were using the XML interface from ASP, your code might look something like Code Listing One. This example is heavily based on one you'll find in the SDK, but our version searches for one particularly cool book.
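Code Listing One itself isn't reproduced here, but as a rough, non-ASP sketch of the same sort of keyword query, the Python below shows the general shape. The endpoint, parameter names and element names are assumptions based on the 2003-era XML interface rather than anything official - treat the SDK documentation as the authority on both.

from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

DEV_TOKEN = "YOUR-DEV-TOKEN"       # the Developer's Token you applied for
ASSOCIATE_TAG = "yourtag-21"       # your existing Amazon Associates ID

params = {
    "t": ASSOCIATE_TAG,
    "dev-t": DEV_TOKEN,
    "KeywordSearch": "one particularly cool book",
    "mode": "books",
    "type": "lite",
    "f": "xml",
}
url = "http://xml.amazon.com/onca/xml3?" + urlencode(params)   # assumed endpoint

with urlopen(url) as response:
    tree = ET.parse(response)

# Assuming each matching product comes back as a <Details> element with
# <ProductName> and <OurPrice> children, as the lite response format did.
for details in tree.iter("Details"):
    print(details.findtext("ProductName", "(unknown)"),
          "-", details.findtext("OurPrice", "n/a"))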
**More on FrontPage 2003
*We said last month we'd investigate further the data-handling capabilities of FrontPage 2003, and we're keeping that promise. After borrowing a spare box from
a colleague, Mark proceeded to install Windows Server 2003, which was released a few days ago. Then he searched the MSDN members area of the Microsoft Developer Network site to find a copy of SharePoint Server 2, which, despite being a server product, wasn't in among previous versions of SharePoint but in the area for Office 11. We're sure there's a good reason for hiding it there, but at a loss to know what. After this 350MB download, SharePoint 2 - which will only run on Windows Server 2
003 - was installed.
**Configuring it was a bit like walking through a dark wood wearing sunglasses: lots of new terminology and a struggle to guess whether it's the website, the virtual server or the SharePoint server part that needs configuring. It all went without a hitch in the end and so, sitting at the client workstation, Mark fired up FrontPage 2003 and created a SharePoint website using the Wizard. The first thing one notices is that the data option on the menu bar now appears, a
nd clicking on it opens a data view pane in the design area. From here, you can connect to different data sources, which may be SharePoint Lists or Libraries, database connections, server-side scripts, Web Services and even XML files.
**Before continuing, it's only fair to mention that FrontPage 2003 is still in beta. While there were some small problems, it generally worked well. Mark was very impressed with the easy way XML data sources could be 'consumed' and their data formatted.
To access a Web Service via FrontPage 2003, you simply click on the 'add to catalog' link and give it the URL of the Web Service's WSDL (Web Service Description Language). After a short delay, the properties of the Web Service appear - its name, operations or methods and any parameters they might take. Clicking OK adds this Web Service to your data source catalog pane, and now you can use it by dragging it from this pane onto a page in your site.
**The data generated appears preformatted
on the page using a default layout, which you can change for one of several others provided, or edit its look just as you would any other object in FrontPage. This works well and produces an XML style sheet that can be edited either by altering code or - more innovatively - from within the graphical design environment.
**That's fine for consuming a Web Service, but what if you have an XML file and want to design a XSLT so that it can be displayed in a more attractive form? Again, the te
chnique is to add the file to your data catalog and drag it to your web page. Mark chose a large XML file of Microsoft bug patches, which was presented as a table with a fixed number of rows with next and previous buttons. The configuration box also offers filtering and sorting of the XML dataset, the result of which could be seen live in the design pane in FrontPage.
**When it comes to creating XSLT files for consuming XML data sources, FrontPage rocks. It's a shame it requires such a h
eavyweight back-end server overhead as Windows 2003 and SharePoint. The reason may be to provide the sort of capabilities that aren't possible with a standard 'web server/HTML' setup, and perhaps we're starting to see a migration to this way of working. In the past, we've had Apple's Web Objects and Macromedia's ColdFusion - now it seems we're going to get Microsoft and SharePoint. This could make for some very interesting design environments supporting fast development, full debugging and
XML support. However, because of the costs and complexity, these will probably remain the preserve of the professional, and the divide between what the professional can achieve compared with 'one man and his Dreamweaver' may grow enormously.
**Mode of the Music Changes
*Every once in a while in the world of web development and e-commerce, there comes along something new that's of such ground-shaking importance that everyone involved in the industry is forced to sit up and take notice. We
make no apologies for repeating news you may have read elsewhere: it's that important.
**So what is it, we hear you clamour? What if we said you could go to a website that sold music and preview the first 30 seconds of any of thousands of tracks, then select some and pay 99 cents per track (about
£6 for a full CD-worth) using a one-click system. These music tracks aren't protected, so you can use them anywhere. Interested?
**If we then told you the company offering this facility was n
o less than Apple, your interest might increase a little further. But you may think the site would only hold old or obscure tracks with no sign of the latest artists or their albums - a bit like a huge online compilation album - as is so often the case. However, when we tell you that this enterprise is supported by the top music companies - BMG, EMI, Sony, Universal and Warner - and that there are more than 200,000 tracks already available, many of them new artists and albums, you'll start
to see how this could easily be the turning point in the way music is distributed.
*It's also the first large-scale recognition of music distribution via the Internet by the music companies themselves and paves the way forward for low-cost, legal digital music. Some people will still copy, but by offering a great service at a good price this enterprise hopes to minimise theft. The music is offered only in AAC audio format, based on the industry-standard MPEG-4 format developed by Dolby,
which apparently offers better sound quality than MP3 for the same file size.
**Apple has called this 'life-altering system' iTunes (<A HREF="http://www.apple.com/itunes" target="_BLANK">www.apple.com/itunes</A>). It also has a local streaming technology called Rendezvous that lets you share your music over your network and possibly - as it's rumoured to be TCP/IP-based - over the Internet.
**Before you rush out to buy music from the iTunes website, there are a couple of things to bear
in mind. Currently, the iTunes system will work only on Apple Macs running OS X, but a Windows version is planned later. Also, the online shop currently works with US billing addresses only, but this will expand as the system develops.
**Incidentally, iTunes also offers 'speaking books', a great boon to people with poor sight, via the site <A HREF="http://www.audible.com" target="_BLANK">www.audible.com</A>. As poor-sighted users might be interested in this, Mark tried to navigate using
a screen reader that reads web pages to you, but found the site very confusing when browsed this way. Designing websites that can be read by people with disabilities is a hot topic at the moment, as legislation will soon make it compulsory for all official sites. In our experience, it requires a little more thought than just making sure your graphics all have <alt> tags. Download a screen reader like the one from the IBM site (<A HREF="http://www-3.ibm.com/able/hpr.html" target="_BLANK"
>www-3.ibm.com/able/hpr.html</A>) and try navigating your favourite websites by sound alone, and you'll soon have even more sympathy for those who have to use this form of navigation.
**Cookie D'oh!
*Most web developers will have experimented with cookies at one time or another. Although they're becoming less useful on public-facing sites, due to the proliferation of 'personal privacy' cookie-blocker software, they still have a place on intranets and other 'closed community' systems. The
y can be used for all manner of tasks such as automated logons, saving user preferences, or even rudimentary shopping basket systems.
**Cookies come in two flavours called 'Persistent' and 'Per-session'. Persistent cookies store information on users' machines between visits to a site for a specific length of time. This expiry value is set by the website when it sends the cookie to the browser. Per-session cookies, on the other hand, are only used to store information for a single visit a
nd are deleted as soon as the session is closed. These non-persistent cookies are frequently used by server-side scripting technologies, such as PHP or Microsoft's ASP, to maintain 'session state'.
**Working with cookies from within your web application is easy. Either from server-side or client-side scripts, or even using a mixture of both, simply use document.cookie to set and retrieve values client-side, and response.cookies (ASP) or setcookie and $_COOKIE (PHP) server-side. If you need further he
lp or examples, Google will oblige.
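As a rough illustration of the persistent versus per-session distinction - rendered here in Python's standard http.cookies module rather than the ASP or PHP equivalents - notice that the only real difference is whether an expiry is attached:

from http.cookies import SimpleCookie

jar = SimpleCookie()

jar["theme"] = "fluffy-pink"          # per-session: no expiry, dies when the browser session ends
jar["visitor_id"] = "abc123"          # persistent: survives between visits...
jar["visitor_id"]["max-age"] = 60 * 60 * 24 * 30   # ...for 30 days
jar["visitor_id"]["path"] = "/"

# output() produces the Set-Cookie headers a server-side script would send
print(jar.output())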
**The problem most web developers have is with the cookie files themselves, especially Internet Explorer cookies. You'll find your IE cookies in a folder somewhere in your Windows or Documents and Settings folder, the exact location depending on which version of Windows you're running and how it has been set up - search for a folder called 'cookies' containing a load of TXT files. As this file extension might imply, the cookies are held as plain text fi
les, but their format is quite obscure. Open one up in Notepad and take a look: here's an example from <A HREF="http://www.streetmap.co.uk" target="_BLANK">www.streetmap.co.uk</A>:
*ValueClickMS5
*true
*www.streetmap.co.uk/
*1088
*1048728192
*29559648
*346684688
*29559447
**
*Although some bits of the data are readable, there's no way you can work out things like expiry dates with the naked eye. The actual fields are as follows:
*Cookie name = ValueClickMS5
*Cookie value = true
*Site name = www.streetmap.co.uk
*Path = / (note that the site name and the path are concatenated)
*Flags = 1088
*Expires = 1048728192, 29559648
*Created = 346684688, 29559447
*This is followed by an asterisk and, optionally, by further cookie values for the same site. The date and time format used within cookie files must have been designed by someone under the influence of mind-bending substances - the first number counts time in ten-millionths of a second, in other words 100-nanosecond ticks (really), but things get stranger still. This 100ns counter rolls back to zero every 429.4967296 seconds, at which point the second number gets incremented! Equally bizarre, the base time for these fields (ie, the moment when both were last zero) is 1 January 1601, right back at the start of the 17th century...
*Let me just repeat that in case you didn't believe it - according to Internet Explorer, time started in 1601 and is measured in 7.158278826-minute units. Time for compulsory urine testing at Redmond?
**Given the examples above, how many readers ca
n do the mental maths to work out the expiry date and time of that Streetmap cookie? I'm guessing not too many. There is, however, a really neat (and free) solution available. Pop over to <A HREF="http://www.digital-detective.co.uk/freetools.asp" target="_BLANK">www.digital-detective.co.uk/freetools.asp</A> and download a copy of CookieView. This simple little utility is brilliant. Don't be put off that it looks a bit like a console window - if you drag and drop a cookie file onto it, it will decode the contents. From there, you can simply copy the contents into an email, a document, or anywhere else you need to. It's a brilliant, simple tool and worth every penny it doesn't cost.
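If you'd rather script the conversion than trust the mental maths, here's a minimal Python sketch. It assumes the two numbers in each date field are the low and high 32-bit halves of a standard Windows FILETIME value - 100-nanosecond ticks counted from 1 January 1601 - which matches the behaviour described above.

from datetime import datetime, timedelta

def cookie_time(low: int, high: int) -> datetime:
    ticks = (high << 32) | low                  # reassemble the 64-bit tick count
    return datetime(1601, 1, 1) + timedelta(microseconds=ticks // 10)

# The Streetmap example from above: Expires = 1048728192, 29559648
print(cookie_time(1048728192, 29559648))        # a date in 2003, as you'd expect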
Paul Ockenden launches his own bookshop, while the earth moves for Mark Newton
August 2003
106 (August 2003)
Web Business
ME) add-on. This is a new Plus Pack that's aimed at 'emerging digital lifestyles' among other such egregious marketing concoctions. I can feel you disgustedly backing away from these words and getting ready to turn the page, but please indulge me for a few more sentences.
**This pack epitomises the modern compelling tool suite, and there's no way I can avoid starting off with its crème de la crème, a treat that alone is reason enough for anyone who doesn't yet have a computer to buy one just to run it.
**I am, of course, referring to the Plus! Dancer application. This breakthrough piece of coding places one of 11 dancing people on your Desktop: their moves are incredible, fluid and quite hypnotic. Naturally, each one possesses not only a name but a potted biography.
**For example, Amanda specialises in House music and apparently 'this 19-year-old, full-time college student is also a dancer, coach, choreographer, instructor, business owner and part-time travel consultan
t. She has been dancing professionally for 14 years, with formal training in ballet, tap, jazz, lyrical, modern, step, cheer and hip-hop. Amanda, who started teaching dance at age 14, is currently a co-owner and artistic director of a dance studio. She is also a dance choreographer for several award-winning cheerleading teams. Amanda was a member of the dance group for a professional basketball team during the 2001-2002 season.' Need I say more?
**Then there's Cobey who specialises in hip
-hop, Seth who does B-Boy, and Kenny who loves disco. Chanel is into funk and Ben does The Sprinkler (sounds vaguely rude). Not content with just one dancer? Then why not push the outer envelope of your graphics card performance by having two dancing together? Nikole and Marcelo do Salsa, Evan and Michele like to Tango, and Jen and Dave are masterful at Swing. My mind is just buzzing with the possibilities here, and I feel strangely compelled to go on and tell you more. Apparently, the dan
cing is supposed to be synchronised to the music, though I'll confess it looked a bit random to me. I guess playing a CD of the music of Tibet, especially the deeply rocking number called 'Singing Bowls of Tibet', might not have helped much. Ah well.
**But let's move on. The Plus! Photo Story application allows you to take a pile of images and record a voice track over the top. You can even drop in a music background and have the images move slowly around on screen in a desperate attempt
to make them more interesting. Naturally, there's a full set of tools to prioritise the various wipes and moves, and once you've created your masterpiece you can output it in Windows Media format for all the family to enjoy. Never before has the humble holiday snap been treated to such a makeover.
**Since you're no doubt already a big fan of Windows Media Player, the Plus! Party Mode add-on will be a godsend. This allows you to re-skin the Media Player so that it can run full screen. You
can also lock down the application so that your friends can select the party music on-screen without getting access to your Desktop - just what the world needs now.
**The Plus! Analog Recorder lets you stream audio in from your grandfather's gramophone via your computer's sound card in order to convert all those old-fashioned vinyl LPs and singles into modern Windows Media files. It can even de-scratch and de-hiss these recordings, removing the pops and clicks that such antique technolog
ies accumulate.
**The Plus! Audio Converter lets you convert the formats of your media files. You can't, however, go straight to MP3 without obtaining a third-party encoder, but Microsoft had to leave room for its third-party development community somewhere.
*You can print labels for your CDs using Plus! Label Maker, wake up to music with Plus! Alarm Clock, fall asleep to music with Plus! Sleep Timer, and spend countless hours fiddling with eight new skins for the Media Player itself.
Since by now you'll have become bored with still pictures, despite the stunning new capabilities of Plus! Photo Story, you'll be gagging for the new transitions and effects that plug into Windows Movie Maker 2. And finally, you can enjoy this multimedia bonanza from your Pocket PC PDA by using the Plus! Sync & Go tool.
**This Plus! DME experience is so overwhelming that I'm now lost for words. Dammit, my sarcasm gland is beginning to hurt, so the truth must be told. Well, I can't deny th
at Plus! DME is hardly expensive at £15.99, but quite frankly it should have been a free download (Redmond ought to take more notice of the 'digital lifestyle hub' work that Apple is currently doing).
**Microsoft should feel deeply ashamed for charging its users - the XP Home brigade one presumes - any money to install this collection of polished but ultimately vacuous distractions. A few parts of the kit are going to be useful and worth having, while others like the dancers are utterly
cringeworthy. Is this the best Microsoft can do?
**MS buys VMS
*Well, actually, no it isn't. Thankfully, Microsoft does still notice some important trends and needs of real users, and comes up with some truly trend-setting technologies. One that has piqued my interest is its recent acquisition of the Virtual PC technology from Connectix. I've been using Connectix's VPC emulation of the Intel processor for some time on my Apple PowerBook, which lets me open a window and boot any Intel OS t
hat I want right there on my Desktop. Care to fiddle with Windows XP Home? No problem. Windows Server 2003 as well? Why not. The only limitation is the amount of RAM your real computer contains, because booting full-blown OSes like XP isn't something you can squeeze into a couple of megabytes. It's worth remembering that Virtual PC runs on other platforms too, including Intel itself, so you can run different OSes in virtual machines on your Desktop without the hassle of shutting down and r
ebooting.
**This is a mature technology too. On my Mac OS X installation of VPC, I can control things like the virtualised networking settings, the way it interacts with hardware like USB ports, and how it maps through into the hard disk space both locally and remotely. And now Microsoft has bought it all.
**The obvious question is 'why on earth would Microsoft want to own a technology that allows you to run virtualised PC systems' when it has always had OSes that just booted on the har
dware anyway? Well, it's becoming clear that Microsoft has two main intentions with this purchase, and both of them are fascinating.
**Let's start with the forthcoming Virtual Server product that Connectix has been hinting at and which is in late beta at the moment. This allows you to boot a server OS into a virtual machine running as a task on your existing OS. Microsoft has a real need for this product as there are still a lot of servers out there running NT 4, and the hardware and driv
er support for NT 4 is now almost frighteningly out of date with the reality of the hardware shipping today. Don't forget that NT 4 first shipped in 1996 and here we are seven years later.
**Back then, a 486 was considered amazing and a few of us were pushing the limits with motherboards running twin Pentium processors at 90MHz. Today, we're about to move out of the multigigahertz Xeon world into a new 64-bit arena. Large IDE format was a dream back then, while technologies like Fibre Ch
annel and Gigabit Ethernet were future fantasies.
**Given that some of these NT 4 systems are still running happily, Microsoft needs something that permits an NT 4 image to be snapshotted and then run on more modern hardware. It isn't just that the old server was slow, but there's a real problem with driver support and hardware reliability. What are you going to do when something breaks on that old 486-powered server? It's not a nice thought.
*There's another reason too for Microsoft t
o want this Virtual Server technology. Given that modern servers can have an almost unlimited amount of power in them, there's a lot to be said for running server services within a virtualised OS that's run over a real base OS. Yes, the base OS will have to simulate the processor that's used in the virtual process, but it can do that very quickly. Being able to move virtual servers around on top of a hardware farm is the ultimate in flexibility. And with the move to standardised and sensib
le interconnection between server services - thanks to the emergence of SOAP and XML protocols - doesn't it make good sense to virtualise each server service into its own hermetically sealed virtual machine?
**The second of Microsoft's intentions is equally fascinating. If you're developing an application on your desktop machine - which is probably running Windows 2000 Professional or XP Pro - you might need access to a server running Exchange 2003 or some strange configuration of Internet
Information Server mated with Oracle and a big SAP installation. Rather than going out and hitting the hardware itself, why not debug against a virtual machine running within your own machine? This is especially powerful if you want to debug between multiple versions, or even between different OSes.
**Microsoft has already demonstrated debugging a Pocket PC application running on Intel's ARM processor on the same machine as the Visual Studio development system itself. It just boots Virtu
al PC with the right sort of processor architecture in place, then boots a real, standard copy of Pocket PC and squirts in your application for running and debugging. The same applies to a smartphone - boot it and work with the same code base and installation as if you were dealing with a real, physical smartphone, only this one exists here on screen.
**I think this way of working is going to have a major impact over the next few years, especially once you realise that the core Windows W
in32 API is being pushed into the background for the forthcoming Longhorn release of the Windows desktop OS (which is likely to be called Windows 2005). The main run-time and interfaces will all be managed code generated from the Visual Studio .NET development system.
**Obviously, not everything is going to be converted to managed code in the next 12 to 18 months, but C++ is deliberately being pushed into the background by Microsoft, and this will be true of all desktop and server apps.
Core OS components, especially device drivers, might well need to remain written in C++ or handwritten machine code for some time to come, but the move to managed code promises to deliver a more stable future where applications are deliberately kept separated by the common run-time engine.
**Office 2003 bundling
*One thing has just struck me about the Office 2003 launch. Microsoft has changed its branding style so that everything is now 'Microsoft Office' and then the individual product
name. So Word is 'Microsoft Office Word 2003' and Excel is 'Microsoft Office Excel 2003'. Now go and take a look at the Start/All Programs menu. Notice how all the Office XP or Office 2000 applications entries have disappeared? See how there's now a Microsoft Office menu folder and this contains all the application shortcuts? I'm still trying to work out why this has happened. It certainly hasn't happened on a whim, and so there must be some logical reason behind it. Microsoft is obviously
trying hard to emphasise the Office suite as an entity and to downplay the role each individual application plays in that space.
**I think this is more than some marketing fiddle, but rather the first steps toward a new application framework for Office. The mindset behind it works like this - since each Office application is, to a greater or lesser extent, pushing hard towards an XML future, each application as presented today is subsumed into the role of a focused toolkit within a fram
ework environment.
**Think of the way that Visual Studio .NET works today to get an idea of where I'm coming from with this. You bolt complete compilers, debugging tools and all sorts of other things straight into the VS .NET framework, and the framework provides all the supporting infrastructure required. I think this is the mindset that's going to be applied to Office.
**This certainly isn't there today in Office XP and won't be fully there in Office 2003 either. However, I can fores
ee an Office 12 or 'Office 2005' for the Longhorn release of Windows where you actually start 'Office' as if it was a tool, then within this Office XML framework you work using whatever tools you want and have bought a licence for. It would certainly help to add much needed fluidity into the licensing process for Office. Currently, there are almost countless packages, and some of the bundling decisions seem a little odd to say the least.
**To back up my hunch, it's interesting to note th
at the forthcoming InfoPath XML application won't be part of any of the Office 2003 suites. This announcement, made only this week, is quite fascinating. Yes, it makes sense that only far-thinking users and developers will need the capabilities of InfoPath, at least for the time being. However, it also sets up an interesting expectation, which goes something like this: because Microsoft isn't putting InfoPath into any Office bundles, it doesn't need to worry about taking it out again in th
e future.
**That's because I think InfoPath is a 'one-version product' whose core functionality will in fact become the 'universal canvas' of the future Office framework product suite, and as such there will no longer be a need or place for InfoPath as a product per se. It will become part of the ether of the Office framework.
**Only time will tell if this hunch is right or wrong, but it's interesting to see how InfoPath, arguably the most important new component in the Office 2003 sui
te, is actually not going to be in any of the suite bundles. You read it here first and, as always, comments are welcome to the usual address.
**64-bit Windows
*The announcement that Microsoft is going to release a 64-bit version of Windows for the AMD Opteron 64-bit processor is welcome news indeed. I've seen it running on prototype AMD hardware, and it works very well. Indeed, it could be argued that Microsoft has decided it no longer needs the weight of a company like Intel backing it,
and it can make its own decisions about the platforms it will support.
**I accept that in the past NT was available on a range of hardware platforms. Indeed, one of my oldest and most trusted servers is still an NT 4 box running on the Digital Alpha reference motherboard. It keeps working, year-in year-out, and frankly I can't be bothered to move the services off it onto another box, because I have absolute (if naive) faith that it will keep working until the moon crashes into the Atlan
tic Ocean.
**But back then it was clear that the hardware was either vastly too expensive, or just not available. Did anyone ever manage to see a PowerPC-based NT box actually for sale, for example?
**With AMD 64-bit, it's different. AMD has a compelling platform and Microsoft has decided to support it. Certainly it seems to be a more rational solution than the Itanium family from Intel, which is almost turning into a long-running parody.
**And a key technology here for Microsoft is the Common Language Runtime (CLR) of Visual Studio .NET. Remember that whatever your programming language - Visual Basic, C#, C++, Cobol, Fortran or some weird academic language - it all compiles to IL (Intermediate Language). The IL code is then compiled by the target CLR on each platform. The IL code really knows nothing about the underlying hardware, or even if there's Windows there at all. So once we see applications coming to market that are written in IL, then porting to new chips like AMD's Opteron and Intel's Itanium, plus anything else Microsoft cares to support, will be much easier.
**When Microsoft said it was betting the farm on the Visual Studio work, it wasn't joking. It's the means by which the company can decide what platforms it works on. The previous methodology of recompiling source code for each platform didn't work. Few applications were ported to Alpha, let alone to MIPS or PowerPC versions of Windows NT. And those that were ported often had any number of exciting bugs, especially in the slightly more esoteric hardware device driver area. Anyone remember the pain of hardware SCSI controllers on Alpha? I rest my case. Although the CLR and IL don't change the problems of underlying driver support, they will make application porting a much more seamless affair, especially with the virtual machine offerings Microsoft now has up its sleeve. And in the 64-bit arena, I can only say that we live in interesting times.
Jon Honeyball is overwhelmed by Microsoft's latest and greatest - Windows XP Plus!
July 2003
105 (July 2003)
Advanced Windows
Simon Jones
SharePoint Services
For companies that require teams of workers to collaborate when producing documents, the big news in Office 2003 will be SharePoint. Virtually all the Office applications will make use of SharePoint in one way or another, and Microsoft's marketing department has begun skilfully intertwining Office and SharePoint in its sales pitches so that you hardly know where one starts and the other stops. So, what exactly is SharePoint and why would you want to use it?
**What Microsoft now refers to
David Moss and Jon Honeyball
Digging deep
If you've ever been faced with disaster recovery, you'll know that it's a problem best approached with a clear mind, lots of structured thought and more than a few sheets of A4 paper on which to carefully write down what you need to do and in which order. Far too often, what was merely a minor drama gets turned into a full-blown crisis because of the well-meaning but ill-informed dabbling of people who don't know what they're doing.
**This is all too frequently the contribution from senio
as 'SharePoint Products and Technologies' grew out of the Digital Dashboard technology that Microsoft started work on about four years ago. Digital Dashboards were web pages constructed from a collection of smaller Web Parts. Each Web Part showed a single nugget of information: a list of customers; a weather map; a graph of sales figures. Administrators could set up these Web Parts and pages, and users could easily say which parts to show where on their own pages.
**This was then extende
d to become a Team website, where users could store documents for collaboration within their team, and provided a home page showing useful information that each user could customise.
**This concept, minus much of the customisation code, was given the name SharePoint Team Services and became part of the Office suite. If you bought Office Developer Edition or just FrontPage, you could design, install and run a SharePoint Team Services website. SharePoint Portal Server was a much bigger pro
duct, which allowed full customisation of Web Part pages and added document check-in/check-out facilities and content indexing. This was a separate, paid-for product and Microsoft claims it has sold 20 million licences for it to date. Large corporations must like the product.
**With the release of Office 2003, Microsoft has totally revamped the SharePoint range of products. SharePoint Team Services is dead and will no longer be supplied with Office. In its place, welcome SharePoint Servic
es, an add-on for Windows Server 2003. SharePoint Services may inherit the SharePoint concept and the name but it has been totally rewritten in ASP .NET. It uses SQL Server to store all its data, while communication between Web Parts, SQL Server and Office is all done using XML. If you don't have SQL Server, SharePoint Services will install MSDE (Microsoft SQL Server Desktop Engine) and use that instead.
**SharePoint Services won't be ready in time for the launch of Windows Server 2003 i
n April, but it should ship about a month later via the Windows Update website. SharePoint Services includes the Web Part engine, check-in/check-out and content indexing. You can create multiple SharePoint websites in your organisation, but they'll remain separate entities. The new SharePoint Portal Server adds a layer above these sites to integrate them into a single, cohesive site for your organisation. It also brings more personalisation in the form of your own web pages, which act for
you as your portal onto the world and, for others, as information about you, the projects you work on and your areas of expertise or interest. Microsoft now refers to SharePoint Services and SharePoint Portal Server collectively as SharePoint Products and Technologies.
**Collaborative features
*The main use of SharePoint will be to facilitate collaboration among team members. These may be permanent teams or ad-hoc teams set up for a particular project - SharePoint doesn't mind if the team
lasts a day, a year or five years. A team could be a set of people from many departments within an organisation who need to collaborate on a document, a meeting or a project. You can even set up social team sites for extra-curricular activities such as the company darts team, squash ladder or brass band. Each SharePoint site has a front page based on Web Parts. To this page you may add lists, discussions, surveys, document and picture libraries. You can even customise the look of a ShareP
oint site by applying a colour theme or editing the pages individually in FrontPage.
**Lists
*Lists contain common information that you might want your team to know. There are predefined lists for announcements, events, contacts, tasks or links, and it's also possible to create your own lists, defining what data should be stored and how it should be displayed.
**Each list you create gets a simple form through which to enter the data, and you may create different views of that data, sort
ing, grouping and filtering the data as you want it. You can create fields for text (single lines, multiple lines or rich text), numbers, dates and so on. It's also possible to attach files to items in the list or associate items with other items in the same list - you can even have fields that look up values in other lists. Lists may be linked to Excel workbooks or Access databases, and such links remain alive and work in both directions.
**Any view of any list can appear in a Web Part
on the front page, which is great for showing summary information such as new announcements, upcoming events, which staff are currently online and useful links. Any list you customise or create may be saved - complete with the definitions of all the fields and views - as a template, allowing you to create clones of successful lists. It's possible to share these templates with other teams throughout your organisation to encourage consistent communication styles.
**Discussions
*Sometimes y
ou need to have a discussion between several people, but they can't all meet at the same place or at the same time. You could use email, but it's difficult to ensure that everyone sees all the comments or for new people to join in partway through a discussion. SharePoint allows you to easily set up discussions where anyone is able to ask a question or post a comment and other users can reply. All the comments are kept so that latecomers may catch up, and the comments are displayed in threa
ded order so you're able to tell which comment goes with which original message. In effect, discussions provide a private conferencing system like a miniature version of Cix or PC Pro's web forums.
**Surveys
*If you want to find out what your team thinks about anything, it's possible to quickly set up a survey to ask them. Each person can answer the survey, but only once, and you may tabulate or graph the results at any time. The results of complex surveys can, of course, be analysed in
Excel, as a survey is just a special type of list.
**Document Libraries
*If several people are co-operating on creating a document, you might store the document on a file server. However, if two people try to edit the same document at the same time, one person risks overwriting the other's changes.
**Microsoft Office applications try to minimise this risk by opening a document that's already being edited as read-only for the second and subsequent people who open it. You could use t
he 'send for review' process in Office to email a document to many people to get their changes and then merge all these changes back into the original document.
**SharePoint lets you set up shared document libraries where users can save documents needed by the team. You can either save a document to a folder on your hard disk or a local network folder and then upload it to SharePoint, or you may save directly from Office applications to a SharePoint document library. The File Open and Fi
le Save As dialogs will show a SharePoint website and allow you to navigate to the correct library.
**Documents in SharePoint may be checked out to someone so that they - and only they - are able to make changes to it. Once they're finished, they can check the document back in so others are able to see the changes and check the document out to make their own changes.
**Picture Libraries
*SharePoint enables you to create a special class of document library for storing pictures, which sho
ws thumbnail images of the pictures rather than just a list of filenames. You can give each picture a title, a description and some searchable keywords, and record the date it was taken, which then enables other users to sort and filter the pictures. You could use a picture library, for example, to hold publicity shots for your company or to share pictures of a social event.
**Document Workspace and Live Attachments
*Documents in a document library can be promoted to separate Document Wor
kspaces, which have their own access lists, document libraries and lists of tasks for the members to carry out. It's possible to create such a Document Workspace by emailing a document to one or more people - Outlook's Attachment Options Task Pane allows you to choose to send the document as a regular or live attachment, and the latter choice will create a Document Workspace.
**If you choose to create a Document Workspace, whenever a recipient goes to open the attachment Office will chec
k the Workspace and give them the latest version from there rather than the original copy that was emailed to them. Thus, everyone always sees the most up-to-date version of the document with everyone's changes rather than just the original version. This means people don't waste their time duplicating comments or changes and they can react to other people's ideas in a timely fashion. This facility should, if used intelligently, greatly reduce the time it takes a group of people to create o
r revise a document.
**Meeting Workspace
*An interesting variation on the Document Workspace is a Meeting Workspace. You can either create these from scratch in SharePoint or when you set an appointment in Outlook. A Meeting Workspace may contain an agenda, a set of objectives or decisions to be taken, a document library of documentation to be read before the meeting, a set of tasks to be allocated, a list of things to bring to the meeting and a list of the attendees. All the attendees ge
t an email containing a link to the Meeting Workspace so they can read the agenda beforehand and the minutes or outcome of the meeting.
**Alerts
*Any user can ask SharePoint to send them an email if someone adds, edits or deletes a specific item or any item from a list or library. These emails may be generated at the time of the change, once a day or once a week. The emails contain links to the item that was added or changed, to the item's list or library and to the SharePoint site. This
makes it easy to see what other people are doing on the SharePoint site.
**SharePoint Portal Server
*All the abilities I've discussed so far are possible using just SharePoint Services, which comes free with Windows Server 2003. What extra then does SharePoint Portal Server give you? Well, as things stand, your departmental SharePoint websites remain separate entities: if the finance department has a SharePoint site and sales has one too, you won't be able to easily navigate from one to
the other or do cross-site searches. SharePoint Portal Server manages site aggregation, so your departmental or team sites can be classified, browsed and searched from a single company-wide intranet home page. IT departments may lock down specific Web Parts or page zones so that, for instance, users can't hide or remove a vital company announcement Web Part from their view of a page.
**SharePoint Portal Server also introduces personal web pages that users can customise, both to show them
selves the information sources they need to do their job and to show others information about themselves and their activities. Links, announcements and other information may be targeted to individuals or groups by their job roles, security groups or any other criterion you can think of. If you have legacy databases or corporate business applications such as SAP that require separate logon credentials to retrieve personal data, SharePoint Portal Server and BizTalk Server can marshal lists o
f usernames and passwords to automate these credentials and retrieve your data from such systems without having to ask you to sign in again.
**Office Web Parts
*As well as the standard Web Parts for showing lists, appointments, users and so on, there'll also be some parts that are more closely tied to Office applications. For example, Microsoft is planning Web Parts to show spreadsheets, PivotTables and PivotCharts. The spreadsheet Web Part can be used to show data from Excel workbooks or
other data sources in a familiar format but live on a SharePoint page. PivotTable and PivotChart Web Parts will show summary data from SQL Server or the spreadsheet Web Part in a way that can be analysed easily to spot trends or exceptions.
**Another Web Part under development is called the Web Clipper, which shows a small section of some other web page on your SharePoint website. You can select individual tables or images from any page and the Web Clipper Web Part will show just that da
ta, refreshing it from the source page whenever necessary. Microsoft is also working on other Web Parts that SharePoint administrators will be able to download from an online library, but these aren't finished yet. They will undoubtedly include NYSE stock price finders, weather forecasts (for US cities) and similar information feeds, but we might get lucky and find that some of them are useful outside of the continental US or the financial communities.
**In addition, since the Web Parts u
sed by SharePoint Services and SharePoint Portal Server are just special types of ASP .NET controls, Microsoft will shortly make available a Web Part Library template for Visual Studio .NET, which will allow developers to create their own Web Parts. I haven't had a chance to try this out for myself yet, but the presentations and walkthroughs I've seen so far make it look reasonably easy.
**In Conclusion
*I've had three weeks now of building SharePoint sites using SharePoint Services Beta
2 and I'm impressed by the stability, ease of use and the sheer usefulness of the whole thing. I'll be investigating SharePoint Portal Server in the next few weeks when I've got a spare machine to install it on, and I'll be interested to try my hand at developing Web Parts when Microsoft releases the templates for Visual Studio .NET.
**Should you be looking at SharePoint too? Well, I'm certainly going to be recommending that many of my clients now look at SharePoint Services as a cost-ef
fective way to implement business communication systems - it gives you an awful lot of bang for just the price of upgrading your server to Windows Server 2003.
**And remember that if you bought Software Assurance on Windows Server 2000, you've already paid for the upgrade in any case. Pricing for SharePoint Portal Server, on the other hand, is yet to be fixed, so we'll have to wait to see if it also represents such good value for money.
Simon Jones guides you through Microsoft's refined group-working tools
July 2003
105 (July 2003)
Applications
Paul Lynch
Sound of silence
Silence has been taking up a surprising amount of my time lately, giving me a deeper understanding of the term 'grim fascination'. I own a number of MP3 players, including several PDAs that are perfectly capable of playing MP3 files, a Linux machine that records and plays MP3s remarkably well, several desktop machines of various persuasions, and a laptop with a good audio player. However, instead of using any or all of these, I've been experimenting with a generic command line-style Java M
r members of a company (especially in the SME/SoHo environment) who have even been known to take the most drastic of steps to fix some tiny, niggling problem. Believe it or not, installing the operating system - note I said 'installing' and not 're-installing' - has been used before now as a 'cure' for Word having the wrong paper size set as default.
**While I agree with anyone who objects that maybe it should be made easier for mere mortals to debug these things, it still doesn't excuse
the wholesale fiddling that so often occurs. Attempting to fix the newly magnified disasters that result becomes a task rather like following the elephant at the circus, armed only with a bucket and shovel.
**And, since I'm having a general moan, could I point out that care should be taken when dealing with the person who has the hapless task of digging your computer network out of the hole into which you tipped it. Asking every five minutes if the network is going to be working again s
hortly is the sort of interruption that's guaranteed to turn the most mild-mannered of IT consultants into a green facsimile of the Hulk at a moment's notice.
**Having said that, we're often to blame for bad communication and the setting of unrealistic expectations, and the software itself sometimes doesn't help. A good example is what happens when an Exchange Server crashes. Experience shows that this is rarely due to any grotesque instability in Exchange Server itself or the underlying
Windows Server, as such installations tend to be standalone application servers that aren't required to do other tasks (like running the office Quake network server) and hence should be stable. But nothing short of full-scale clustering can get around a bona fide failure of the underlying hardware.
*Getting a dead Exchange Server back up and running is one thing, but getting the underlying OS sorted out - especially if this is a single-server, small-business solution where you have Exc
hange Server 2000 installed on top of Windows 2000 with Active Directory, all running on the same box - is another level of complexity and hence takes time. Meanwhile, it isn't unreasonable of the users to be worried about whether they're going to get their files and emails back.
**In today's world, getting access to vital emails can be more important than the file system itself, because often the files are stored as attachments to emails, and emails form the lifeblood of the ongoing comm
unication of a company, both within and without. So you can imagine how tense things can get if you discover that you can't get Exchange Server running and that the backup tape drive has fallen over and you're worried that the EDB files are lost.
**Let's describe a possible scenario. You're presented with a backup tape from 15 months ago, and its owner cheerfully states that there's an email on there, now long-deleted from the current system, which must be retrieved. No excuses will do -
you have to get this email back. What do you do? One obvious solution is to build a new server, set it up to pretend to be the main server and then to do a recovery of Exchange Server onto that server and thus bring the data back online. You can then hook up a copy of Outlook to the new server and dump out the required emails to a PST file.
**All of this takes time, and not only does time equal money, but often 'now' really does mean 'now'. Worst of all, you're due at the pub in an hour
and that certainly doesn't leave enough time to do a scratch install, Exchange Server install, data recovery and email extraction. What do you do?
**And here's a second common scenario. You decide, after much deliberation, that your Exchange Server is a bit 'Donald Ducked' due to an unidentified permissions screw-up buried somewhere deep in the system. Maybe you'd done something unwise like DCPROMOing Active Directory onto your Exchange Server member server (or, worse still, DCPROMOed it
off again). Anyway, you decide it would be a good idea to back up all the Exchange Server data, uninstall and reinstall Exchange Server. Then you can restore the data and everything will be fine.
**So you trot off down this apparently sensible path, only to discover near the end that the backup information won't restore onto the server. You had complete faith in the workability of your backup/restore procedure so hadn't bothered making a PST local file copy of your Exchange Server inbox,
but now the beads of cold sweat suggest this wasn't such a bright assumption. What do you do?
**PowerControls
*Well, the first of these scenarios happened to me last week, and the second happened to a friend only last night. Fortunately, I had a tool to hand that made the problem-solving a matter of a few mouse clicks. Not only does it genuinely work, it works so well system admins will have to find new ways of padding out their timesheets. The tool in question is called PowerControls 1.1 from Ontrack, a firm well-known for its clean-room hard disk-recovery facilities.
**Ontrack's programmers obviously wrote PowerControls to help unpick the sort of mess they must see on an almost daily basis, and its basic premise is simple - to open up the core EDB database files from an Exchange Server installation and enable you to browse them and extract any item at will.
**To understand why this is significant, let me point out as clearly as possible that you don't need to have E
xchange Server installed or running, or even within 100 yards of the scene. This tool works directly on the underlying data files themselves. You don't need to be working on a server or anywhere near Active Directory. You don't need to be on the source network because there's no authentication required or desired. PowerControls just unpeels the EDB database files and lets you peer inside.
**At this point, you may be wailing that your Windows Backup creates a BKF file so you don't have the
EDB files. No problem - one of the functions of PowerControls is a Wizard that you can point at a BKF file, which looks inside and extracts the EDB files for you. And if you have everything stored on a tape drive, you don't need to recover to disk, as PowerControls will control your tape drive and extract from the EDB files held on that.
**How is all this presented to you as a user? The program looks rather like a mutant form of Outlook: in the upper half of the screen is an Outlook-esqu
e tree view of your mailboxes, which is the source window, while the lower half is the destination window. Let's go back to my first disaster scenario - recovering an old email from a backup file. PowerControls' Extraction Wizard pulled the EDB files out from the BKF backup file and dropped them onto the local hard disk. I specified that I wanted to use a local PST file as the destination container, and once I'd specified this PowerControls went to work and opened up the EDB file for my in
spection.
**There I could browse all the mailboxes that were stored on that EDB storage group, drill down into each one and then be presented with the usual Outlook-alike set of containers - inbox, drafts, deleted items, contacts and so forth. I drilled down into the inbox to a subfolder the user had created and found that vital email message complete with its attachment of a Word document. A quick drag-and-drop deposited a copy of the whole thing into my PST file. I could then take the
PST file to the anxious user, mount it up into Outlook and magically recover that old email into his inbox for him.
**Total time from start to finish? Under five minutes, honestly. See what I mean about padding out the timesheets? Perhaps I should have acted up a bit, making several cups of strong black coffee and being heard to mutter ostentatiously 'it's not easy this, it's taking time, I'm going as quick as I can' before disappearing back into the machine room, putting my feet up and
reading the newspaper for at least a couple of hours so that a more credible length of time had elapsed.
**But I have to come clean with you - it was the matter of a few minutes only, and that included learning the UI and reading the online help to make sure I was doing things in the right order. It would have taken longer if I'd had the backup on a tape, because some degree of tape shuffling and reading would have been required. But this was a disc-targeted backup system that was second
arily backed up to tape, so I had a shiny BKF file to play with already.
**In the second scenario, my friend's home/office Exchange Server installation was acting a bit wobbly so he decided that a backup, uninstall and reinstall was just the ticket. Halfway through the evening, I got a sheepish phone call telling me that it had all gone horribly wrong and he wasn't able to restore the data. Maybe it might be possible to use this fabulous new tool I had been raving about only a few hours
earlier? It would have been churlish to refuse, so he made a terminal-server connection into my network and transferred his BKF file over.
**PowerControls enabled him to recover his entire inbox structure and all the mailboxes, contacts and calendars. Although I'd been impressed enough with the recovery of just one email in my initial test, I was amazed that it recovered everything out of a complete Exchange Server installation. Out of more than 20,000 messages, only a few weren't recove
rable, the software complaining that they were corrupted (almost certainly by some age-old anti-virus software). All the contacts, calendars, emails and so forth were dropped into the PST files for subsequent recovery into a fresh Exchange Server installation.
**So PowerControls works, and works brilliantly. Are there any downsides? I should point out that you'd better keep your backup tapes, BKF and EDB files safe and secure because there's no password or security lockout here - load th
e file, choose the mailbox and away you go. There's nothing to stop you from diving into any of the data and, if you can do it, any hacker armed with this tool can do it too.
*The only thing that makes this relatively unlikely is the high cost of the software - the standard version costs £699 per server and recovers up to 100 mailboxes. The Business version costs £999 per server and recovers up to 250 mailboxes, while for larger installations there's a per mailbox and per server-based pri
cing scheme. It's not cheap, but it's certainly not expensive when you need it. It's perhaps a little way over the price pain threshold for 'I'll buy that just in case I need it', but when you do need it nothing else will do.
**I should also point out that PowerControls doesn't remove the need for a proper disaster-recovery plan for Exchange Server. Indeed, such an event can now become a two-pronged affair - identify those key users who must have access to their mailboxes right now and u
se PowerControls to extract their data immediately. At the same time, start on the full recovery path. Some of this will get easier with Outlook 2003 when coupled to Exchange 2003, if you set up the local caching facility. In this scenario, a workstation just goes 'offline' from the server when the server disappears, but its user can still work with a local cache of their data. However, this doesn't work with earlier versions of either software unless you've told it that each machine is a
roaming laptop, which introduces its own headaches.
**PowerControls 1.1 has proven itself twice in two difficult situations and I can't ask for more, so recommendation is mandatory. Contact <A HREF="http://www.ontrack.co.uk" target="_BLANK">www.ontrack.co.uk</A> where you can download a demo version.
**Other Exchange tips
*There are a few other things you can do that will help if Exchange Server goes sour. First, look at whether you have enough disk space and ensure Exchange Server do
esn't actually delete things when you tell it to in Outlook. This is called 'mail retention' and it can save your bacon. This is especially true if you set up Exchange Server to hang on to mailboxes even once you've administratively deleted them: the item to look for is called Deletion Settings and you'll find it on the Limits tab of the properties of the Mailbox Store.
**Drill down to First Organisation/Servers/Servername/First Storage Group/Mailbox Store and right-click on that and cho
ose Properties. On the Limits tab, you'll find a group called Deletion Settings, which has two entries: 'Keep deleted items for (days)' and 'Keep deleted mailboxes for (days)'. There's a checkbox in here too called 'Do not permanently delete mailboxes and items until the store has been backed up'. What are good settings here? Start with 30 days for each setting and turn on the checkbox. This way, you'll be sure that everything got into a backup set before it was finally deleted.
Finally, bear in mind that some companies operate a 'delete it all' policy, which is especially common in heavily regulated spaces like pharmaceutical and financial companies. In these organisations, you have to operate a 'clean inbox' policy by deleting anything after a specific date - often 30 days. The logic behind this is that anything considered important and worth keeping should be put into a company document repository, like Documentum or an equivalent, and that an email inbox is to
o small a bucket in which to keep important information.
**Other firms insist that everything must be filed into public folder trees in Exchange Server, structured on a project, team or departmental basis, and thus there's no need for information to lurk in an essentially 'private' email store. I can see the merit to such solutions, provided there's a proper repository for information to be held for the long term. All too often, though, companies just impose some arbitrary inbox storage
limit on the user, who then has to frantically delete anything of any size to keep under the mailbox limit. Not only is this a grotesque waste of time in an era where disk storage is cheaper than it has ever been, but it also tends toward the inevitable destruction of the long-term information heritage of that company. Such info-annihilation may well come back to haunt the company in the future.
Jon Honeyball goes it alone this month and tackles the thorny issue of disaster recovery.
July 2003
105 (July 2003)
Back Office
P3 player on my laptop (a good but broken Apple PowerBook G4).
**I saw someone demonstrate this player as an example of how conformant Mac OS X is now to the world of standards, deploying this simple, cross-platform MP3 player as the core of an application that uses AppleScript to drive it. Since then, I've spent hours downloading and experimenting with the player, my excuse being that a significant proportion of my time is devoted to Java programming and a project like this is pretty clo
se to the bone.
**The player I'm using is called jmpg123 (from <A HREF="http://www.mpg123.de" target="_BLANK">www.mpg123.de</A>) and comes with a copyright date of 1998, which is surprisingly old for something that still works. It requires Java 1.2 or better, so I've updated my laptop to Java 1.4.1 and luckily it still works.
**The grim fascination lies in the fact that it plays music with accurate pitch, although extremely slowly on my laptop. Being used to analog music sources, I'd e
xpect a slurring of pitch if the player can't keep up, but this doesn't happen with jmpg123. It turns out, then, that this app is a reasonable test of Java performance and can be used directly to compare the relative performance of different computers. When running, it bumps the cycles graph on my CPU monitor from less than 5 per cent utilisation to around 20 per cent, so it isn't a heavy load. Getting it to work isn't difficult either and is even easier if you read the accompanying documentat
ion. Download the 100 per cent pure Java player from <A HREF="http://www.mpg123.de" target="_BLANK">www.mpg123.de</A>, open the archive file and, assuming you have the default Java classpath set up - and you should know if you've changed it - it will run with the command line:
**cd jmpg
*java -cp jmpg.jar jmpg/jmpg123 <music.mp3>
*where <music.mp3> is the correct path to some playable MP3 file. Just locate the file and drop it on the Terminal window to fill the path in automatical
ly. The default options play the MP3 file through the native sound output device, but if this doesn't work there are further options to pipe the sound output into any device you wish.
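**For anyone who wants to drive playback from their own code rather than the jmpg123 command line, the following is a minimal, illustrative sketch using the standard javax.sound.sampled API on a current JDK - it isn't part of jmpg123 and, unlike jmpg123, the stock API handles WAV/AIFF rather than MP3; the file name is just a placeholder.

import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class PlayClip {
    public static void main(String[] args) throws Exception {
        // Placeholder path - point this at any uncompressed WAV or AIFF file.
        File file = new File(args.length > 0 ? args[0] : "music.wav");
        try (AudioInputStream in = AudioSystem.getAudioInputStream(file)) {
            Clip clip = AudioSystem.getClip(); // a line that can hold the whole clip
            clip.open(in);                     // load the audio data
            clip.start();                      // play through the default output device
            // start() returns immediately, so keep the JVM alive until playback ends.
            Thread.sleep(clip.getMicrosecondLength() / 1000);
            clip.close();
        }
    }
}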
**Using these same principles, it's easy to install and run any regular Java package on Mac OS X. Many of the main Java apps now come with specific Mac OS X installers, allowing their developers to pre-configure the products to Mac OS X conventions, which usually means just an application wrapper to give it
a pretty icon, though it's possible to go further. JEdit is a well-known Java editor (oriented towards programmers) whose graphical user interface is entirely written in Java. Installing JEdit is a simple matter of going to <A HREF="http://www.jedit.org" target="_BLANK">www.jedit.org</A>, navigating to the Mac OS X download page, clicking the download link, waiting for the file to load automatically, and copying the JEdit application to a suitable place on your hard disk.
**Since the enti
re interface is in the Java application, it isn't totally consistent with Mac conventions - for example, the Application menu is in the main window rather than in the Application menu bar across the top of the screen - but at least this is consistent across different platforms, which is some advantage.
**Java on PDAs
*I may spend a lot of time programming in Java, but I don't do it using (or targeting) PDAs, for a number of good reasons. In past columns, I pointed out that there weren't
any good PDA implementations of Java, but I no longer believe this to be entirely true. A second reason is that Java implementations for PDAs are sufficiently different from desktop and server implementations to be a pain. Lastly, and most importantly, the only effective Java development environments I've found run on desktops and cross-compile to the PDA - a development tool that runs effectively on the PDA itself remains elusive.
**Writing programs on a server or desktop PC with an exte
rnal device as the target is nothing new, whether it's an embedded processor, a PDA or a nuclear reactor control system. In fact, for professional programmers, developing in the target environment is rare, with PCs and Macs proving the exception rather than the norm. And even among PC developers, the spec of their working machine is typically very different from that of the machines their applications are intended to run on. So it isn't surprising, then, that there are no good tools for professional development that
actually run on a PDA. There are plenty of development tools and utilities out there, but the distinction between tools for professional and casual use is very wide.
**Although a professional programmer, I'm also a casual user of the systems I develop for and so often use tools intended for the casual user. For instance, I used to employ a Forth programming environment on one PDA that I also programmed for on a desktop machine. I didn't actually develop in Forth, but used it as a kind of
super-programmable calculator to do quick sums that I couldn't remember how to solve on the desktop; for example, Forth makes it simple to calculate with and convert between number bases. I could easily see myself developing in Smalltalk or Java on a desktop system and using a PDA implementation of the same language as an effective aid to test out small pieces of code.
**However, for Java, the right tools for developing on the desktop or laptop don't carry over into PDA implementations.
On the desktop, I'm going to be developing graphical apps using Swing, the graphical user interface framework for the full version of Java, but neither of the Java implementations for PDAs - J2ME and PersonalJava - includes Swing.
*Instead, you have to use the older and unpleasant AWT, which is like writing web scripts using ASP .NET on the desktop and ASP 3 on the PDA, although to be fair ASP 3 is slightly worse to program in than AWT, and ASP .NET slightly better than Swing.
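**To make the contrast concrete, here is the same trivial window written twice - once against plain AWT, which is roughly the level the PDA profiles leave you at, and once against Swing as you would write it on the desktop. This is an illustrative sketch rather than anything from the column.

import java.awt.Button;
import java.awt.FlowLayout;
import java.awt.Frame;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public class TwoToolkits {
    // AWT: heavyweight, peer-based widgets - the style the PDA profiles force on you.
    static void awtWindow() {
        Frame f = new Frame("AWT");
        f.setLayout(new FlowLayout());
        f.add(new Button("Press me"));
        f.pack();
        f.setVisible(true);
    }

    // Swing: lightweight widgets with a pluggable look and feel - desktop Java only.
    static void swingWindow() {
        JFrame f = new JFrame("Swing");
        f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        f.setLayout(new FlowLayout());
        f.add(new JButton("Press me"));
        f.pack();
        f.setVisible(true);
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(TwoToolkits::swingWindow); // build the Swing UI on the event thread
        awtWindow();
    }
}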
**The oth
er problem is that of optimising programs for different device models. When programming for the desktop, I assume the user could be using a range of display sizes, that they probably don't use 640 x 480 any more but could scale up to 1,600 x 1,200, and this is all handled for me by the operating system. With PDAs, by contrast, each model has a fixed screen size, resolution, colour depth and even set of hardware buttons, and there are relatively few such fixed configurations to cover.
**This was certainly the case for the Symbian OS, u
ntil a proliferation of ports meant they gave up their system of fixed models, and much the same is true for Palm OS and Pocket PC. This means that for PDAs a lot more effort, proportionately, has to go into explicitly writing code to handle different device layouts.
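**A small sketch of the desktop half of that argument (again an illustration, not code from the column): a desktop Java program can simply ask the toolkit how big the screen is and size itself accordingly, whereas on the PDA profiles of the time you typically ended up branching on known device models by hand.

import java.awt.Dimension;
import java.awt.Toolkit;
import javax.swing.JFrame;

public class FitToScreen {
    public static void main(String[] args) {
        // The desktop toolkit reports whatever resolution the user happens to run.
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();
        JFrame f = new JFrame("Resolution-independent window");
        f.setSize(screen.width * 2 / 3, screen.height * 2 / 3); // take about two-thirds of it
        f.setLocationRelativeTo(null); // centre on whatever screen that is
        f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        f.setVisible(true);
    }
}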
**Possibly the best PDA Java implementation at present is the one for the Sharp SL-5xxx range, and it should be the most rewarding to experiment with. However, it does use a port of J2ME so suffers from some of the problems I
just mentioned. I'd be interested if any readers have experience of developing seriously in Java on their Sharp, or any other PDA development system for that matter.
**Messing about in boats
*Continuing along this same line of thought, a reader asked me: 'I'm a web/systems developer who also delivers yachts as a bit of a sideline. I'm thinking about how I can continue my programming work while I'm on a delivery. Laptops are out because of their power requirements, and in fact the only th
ing that will do is a PDA with replaceable batteries.'
**This question rattled the cages of several of the Real World editors. Unfortunately, the answer has to be 'No' to the question as posed, for all the reasons I've outlined above. Software for the casual programmer certainly exists, but you'd be restricted to using an interpreted language that runs directly on the PDA itself - not very portable. The biggest factor militating against using a PDA for this sort of development work is tha
t it doesn't provide the facilities of a full-scale professional development environment. The closest you might come would be the Sharp SL-5xxx series I just mentioned, but I haven't seen any suitable tools, nor could I imagine how a desktop-derived development system could be successfully ported to such a restricted display.
**Most PDAs nowadays have built-in rechargeable batteries and few, apart from the bottom-of-range Palms, will still run from disposables. Having said that, I wonder
if the question is too restrictive, as most yachts should be able to run a small generator that could provide decent-quality power for recharging PDA batteries, or even to run a reasonably frugal laptop. Also, I can think of few PDAs that are suitably rugged for the wet, rolling boat environment, even with a generator. A ruggedised laptop may be more feasible and could support a professional development environment as well.
Paul Lynch puts on his programming hat and fishes around in the Java sea.
July 2003
105 (July 2003)
Mobile Computing
Brian Heywood
FX online
One studio task that can take up an inordinate amount of time, considering its modest technical difficulty, is the adding of sound effects to an audio project. This kind of job is frequently required for both music CDs and soundtracks for video projects, and a need can also arise in multimedia applications - for instance, sounds that are associated with button clicks or page turns.
**The problem has nothing to do with the actual process of adding the sampled sound to your project, which i
s a relatively simple procedure given virtually any modern PC-based digital audio editor. No, the problem arises from the amount of time it takes to find the sound effect samples in the first place.
**Like a lot of people who've been working in the audio industry for some time, I've collected over the years a small library of sound effects so that I can quickly lay my hands on a range of different sound samples when required. In the past, if I didn't already have a suitable sound effect i
n this library, there were two options: either buy a commercial sound effects CD, or go out and record (or otherwise create) a suitable 'noise'.
**Both these approaches can be time-consuming and expensive. While there's a large number of commercial sound-sample CDs available, good ones tend to be costly and you may have to buy an entire set just to get the one effect you need. The alternative, recording a sound effect from scratch, is an art in itself and it can be difficult to get usabl
e results.
**The usual solution to this problem in a big studio situation is to keep a database of sound effect samples categorised by type available on the studio network. This means whenever you need a door-slam or the sound of a gunshot for your movie soundtrack, you just search the database and select a suitable audio file from the results.
**Now, since the development of the Internet, such a facility no longer has to be maintained in-house, and a number of companies provide sample
databases online. The idea is that you can search their databases for the sample sound you need, audition it at a low resolution from the list provided, and download the one you like best. Then you pay for the sample via an online credit card facility or by setting up an annual subscription if you're a heavy user. This pay-per-sample approach makes sense if you're a fairly light user, since you can download just what you need rather than having to buy the whole set.
**The payment covers
all the appropriate royalties for the samples so that you can use them wherever you like, although you aren't allowed to sell them on as raw samples.
**Sound-effects-library.com
*The service I've been using most recently is provided by <A HREF="http://www.sound-effects-library.com" target="_BLANK">www.sound-effects-library.com</A>, which reckons it has more than 200,000 high-quality sound samples available for download. These samples include the whole BBC Sound Effects Library - all 165
CDs - plus effects from films such as Batman and Robin, Clear and Present Danger, The Fugitive, Tomb Raider and Star Trek: Nemesis to single out but a few. As well as straight sound effects, it contains music tracks and samples including drum loops and breakbeats. You access the database of samples using your standard web browser, enhanced by a Shockwave plug-in that can be downloaded from the website.
**There are two ways to search for sounds on this system: a simple keyword search or a
more advanced version that lets you specify a sample category. For instance, when I did a search on 'air raid sirens' using the simple keyword interface, it came up with over 800 matches including all samples that contained the words 'air', 'raid' or 'siren' in their description. However, after restricting the category to 'SIRENS', the list was reduced to a more manageable 26 hits.
*You can also specify multiple categories with a Boolean AND relationship to further limit the choices displ
ayed - for example, 'AIR' AND 'RAID' reduced the results list to 22 hits but meant that some appropriate samples were excluded from the list because they weren't included in both of the categories I defined. It's also possible to set the maximum or minimum length of the samples in seconds to further refine the search.
**Another way of selecting samples is to browse the library using the alphabetic list of sound-effect categories. The list is refined by typing in partial words, so typing i
n 'pipe' gets you 'BAGPIPES', 'AEROPLANES:PIPER', 'CHURCH: ORGAN:PIPE' and so on. Once you have a list of samples, you audition them by highlighting a sample and applying the tape transport controls on the search results panel. Samples you play in this way have a 'hiss' signal added to them, presumably to prevent piracy.
**You can also audition a file by pressing the Shift key and using the right arrow key to play the sample, the left arrow key to stop playback, and the up and down keys
to scroll through the samples list. This second means of playing samples is a lot quicker and dispenses with the added hiss, which makes for a more pleasant selection process (but makes you wonder why they bothered adding the hiss in the first place).
**When you find a suitable sound sample, you add it to your shopping cart by clicking on the appropriate button. You can afford to be generous with multiple choices at this stage, as files in the cart can also be auditioned for you to furthe
r prune your selection at a later stage. It's possible to save the list for a later date, even between sessions, presumably by creating a 'cookie' on your hard disk.
**Once you've decided which samples you're going to purchase, you check out and select the way you want to pay for the session and the file delivery method (download or mail order disk), the file type (AIF or WAV) and the sample rate (44.1kHz or 48kHz). You can either pay online via your credit card or subscribe for unlimite
d downloads for £999 per year, which is worthwhile if you expect to be downloading more than about 200 samples per year.
**Once the transaction has gone through, you're presented with a web page with links to the files you've purchased, encoded as ZIP files. Just beware that these files are usually in stereo, so they can be quite large - 175KB per stereo second for audio sampled at 44.1kHz. You also receive an email with links to the files, so you could conceivably send them to a second l
ocation, say, if you were working at one office but needed the files to be accessible at another.
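**If you're wondering where that per-second figure comes from, it's just the raw PCM data rate. The little sum below is a back-of-envelope check rather than anything from the supplier, and it lands in the same region as the roughly 175KB quoted.

public class StereoDataRate {
    public static void main(String[] args) {
        int sampleRate = 44_100;  // samples per second per channel (CD quality)
        int channels = 2;         // stereo
        int bytesPerSample = 2;   // 16-bit samples

        int bytesPerSecond = sampleRate * channels * bytesPerSample; // 176,400 bytes
        System.out.printf("%,d bytes/s = %.0f KB/s (decimal) = %.0f KiB/s%n",
                bytesPerSecond, bytesPerSecond / 1000.0, bytesPerSecond / 1024.0);
        // Prints: 176,400 bytes/s = 176 KB/s (decimal) = 172 KiB/s
    }
}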
**This represents my ideal application of the Internet, since you can basically offload the hassle of maintaining a comprehensive sample library onto someone else's shoulders. The nature of the Net also means you can access this information at any time of the day or night rather than having to wait for business hours to purchase the samples you need. The auditioning feature allows you to dir
ectly select the samples you want rather than having to take pot luck on the basis of a description of the sounds on a commercial sample CD sleeve. Big users of sound effects like audio post-production studios, multimedia developers and academic institutions can also acquire a facility at a fixed annual cost. Instant gratification - sounds good to me.
**MacDrive 5
*If you work - or intend to work - in the audio business, you'll almost certainly come across a situation where you either ha
ve to read or write to a native Macintosh-format disk. This could be because you have to exchange audio with a Mac-equipped studio or perhaps send artwork materials to a graphics house for a cover design. While Macs can read and write PC formats quite happily, Apple users tend to be far less capable when dealing with situations that lie outside their normal experience, and this is especially true of graphic designers who tend to be Mac-based as a matter of course. To be fair to Mac users,
this is all part of the 'philosophy' of the way Apple implements its systems, but it can become a problem if you need to get data across to a 'Mac head'.
**The easiest solution is to deliver the data on a Mac-formatted disk, and I've been using a handy utility called MacDrive to do just this. Now at version 5, MacDrive does two things: it sits in the background and recognises whenever a Mac-formatted disk is inserted into a drive, handling its operation seamlessly, and it also adds a Mac
Drive submenu to the drive's Context menu that lets you format or copy a Mac disk.
**The utility works with all media types, including floppies, ZIP, Jaz, ORB, SyQuest, hard drives, CD-ROMs and multisession CD-Rs. It also handles the 'typing' of files going in both directions - common file types are pre-configured - and offers a means of editing or defining types. This means that if you transfer a file to a Mac user, they'll be able to double-click on it to start the appropriate applicat
ion, as the Mac operating system will recognise the type of file. This works in the other direction as well, with MacDrive adding the correct file extension so that Windows will recognise the file type and start up the application currently configured for that file type. At $49 (£31), MacDrive looks like a bargain. See <A HREF="http://www.mediafour.com/products/macdrive" target="_BLANK">www.mediafour.com/products/macdrive</A> for more details.
**A reader writes
*Reader Micky Lomman wrote
recently to ask me this: 'I have heaps of audio tapes from the 1980s (sad but true) that I would like to burn onto CD via my computer. I've heard this is possible, but that is the extent of my knowledge in this area. What would I need and how would I do it?'
**In some ways, transferring and improving the quality of audio from cassette tapes is one of the easiest recovery tasks you can do with your PC, because the main source of unwanted noise - tape hiss - is fairly constant, thanks to the w
ay the playback mechanism works. Getting good results from vinyl disks is far more difficult and time-consuming because of the nature of the medium.
**To get the best results for this task, you'll need a decent sound card - say, a Creative Labs Audigy - and a reasonable hi-fi-quality cassette player. The ultimate quality you can achieve in the process is defined by both the playback machine and the analog-to-digital converters in your sound card. Once the audio is in the digital domain, t
here'll be no further loss of quality except possibly as an artefact of the noise-removal process itself.
**On the software side, you'll need an audio editor that can reduce noise, such as Cool Edit (<A HREF="http://www.syntrillium.com/cooledit" target="_BLANK">www.syntrillium.com/cooledit</A>). Simply connect the output of your cassette machine to the line level input of your sound card, typically using a cable with twin RCA phono connectors to a stereo 3.5mm mini-jack. You'll probably n
eed to adjust the PC's recording level by juggling the cassette player volume control in combination with the mixer level controls on your PC's mixer application. You can do this by selecting the Recording level panel from the Options | Properties dialog on your standard Windows audio mixer control panel.
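**If you'd rather capture the line-in feed programmatically instead of recording inside an editor like Cool Edit, the sketch below shows the idea using the standard javax.sound.sampled API. It's an illustration rather than part of the recommended workflow: the output file name is invented, there's no error handling, and it simply records from the default capture device.

import java.io.File;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;

public class CaptureLineIn {
    public static void main(String[] args) throws Exception {
        // 44.1kHz, 16-bit, stereo, signed, little-endian - CD-quality capture.
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);

        TargetDataLine line = (TargetDataLine) AudioSystem.getLine(info);
        line.open(format);
        line.start(); // begin pulling samples from the default capture device

        // write() blocks until the line is stopped or closed, so run it in a
        // thread and call line.stop() once the cassette side has finished.
        AudioSystem.write(new AudioInputStream(line),
                AudioFileFormat.Type.WAVE, new File("side-a.wav"));
    }
}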
**If the recording was made using a Dolby noise-reduction system (usually Dolby B), you may want to turn this off on the cassette player. This will enhance the higher frequency respons
e, which is the first thing to suffer as a tape ages. In general, it's far easier to remove too much of something, like high frequencies, than to enhance them if they're too low.
*Once you've recorded the desired tracks onto your PC's hard disk, you can use the audio profiler from the noise-reduction dialog to determine the frequency characteristic of the tape hiss between two tracks. You then apply this profile to reduce the noise across the entire recording. Be aware that this process do
es affect the quality of the original signal as well, and if the hiss is at a high level you might start hearing 'underwater' effects or other artefacts, including a metallic quality to the original material you're trying to recover.
**Profile-based noise-reduction systems - like that provided by Cool Edit - usually allow you to control the severity of the noise-reduction process, so you should be able to find an acceptable trade-off between the amount of hiss removed and the resulting
quality of the recording. Also, if your tapes are very old, you may need to increase the treble using one of the equalisation options. Just remember to use your ears during the procedure!
**One final point: remember that commercial releases are covered by International Copyright law, and re-recording them for personal use is illegal in a lot of countries - including the UK - since you've only paid for a licence in the original tape format. Some countries do allow this kind of archiving f
or personal use in conjunction with levying a blank tape/media tax, which gets reimbursed to the copyright owners (eventually). You'll need to check out the local legal situation before you embark on this kind of activity. The distribution of copyrighted material on a commercial basis without an appropriate licence is also illegal and may well be pursued through the courts by the copyright holder with the support of anti-piracy organisations.
Brian Heywood looks at how to find peculiar noises, stunning explosions and the odd footstep online.
July 2003
105 (July 2003)
MULTIMEDIA:AUDIO
Steve Cassidy
Just for KiX
Longevity isn't considered a virtue in the computer business. Even though a recently toughening market has slightly slowed the momentum of its tidal wave-like progress, we're still looking at typical product turnover times of around 18 months (for example, if Microsoft is to be believed, we'll soon be in shock and awe at the introduction of Windows 2003 Server). Despite this continual washing away of the old, though, some old themes have a way of repeatedly cropping up, pushing through all
It's been quite a while since anyone asked me about Usenet newsreader clients or, indeed, about Usenet itself, which has become something of a niche channel in the face of competition from the Web and more immediate chat and conferencing systems. Usenet is far from dead, but it's struggling to rid itself of stigma-by-association with the pornmongers, spammers and virus writers (not to mention those sad trolls and flamers whose lives seem somehow validated by reducing newbies to tears).
the glitz and glamour to keep up the good work in ways you didn't expect.
**I encountered one of these outcroppings the other day while putting together yet another Windows 2000 server. As force of habit dictates, I hauled out the Windows 2000 Resource Kit CD, slapped it in and went and had another coffee. In this particular case, the server build was part of a larger job that involved bringing in a new Citrix MetaFrame server, so I had a bit of a conundrum running around my mind. How co
uld I tell the various login scripts in the environment which PC they were running on, given that the old MetaFrame server hadn't moved its drive letters around (and was therefore booting from the drive C), whereas the new one had and so was booting from the drive M, which lay bang in the middle of the mapped drive spaces?
**A shudder of nostalgia passed over me, because on that Resource Kit CD install was a copy of KiXtart. Those of you whose floor joists are way over-specified may be ab
le to reach back far enough among your PC Pro back issues to find me eulogising about this little utility in 1995. Back then, I was bemoaning the lack of any facility in Windows NT equivalent to the login script feature in Novell NetWare. KiXtart was the best way to get, in a pure NT network, the same flexibility of batch language at login time as I'd been used to under Novell. It's a cute utility, written by a chap at Microsoft Netherlands, and slipped out in line with the main release o
f the OS as a free add-on.
**Now here we are, eight years later. Novell is still around, not that you'd ever notice at our end of the small network market. If you believe the hype, we all now live in a brave new world of Microsoft networking. Active Directory is out there, it's mature and it's keeping all your stuff safe in its utterly incomprehensible grip. So why is this free and unsupported utility from 1995 still being supplied on the Resource Kit CD?
**Because it won't go away, that
's why. A couple of months ago, I wrote about how much faster you can make your inter-machine network copies run if you dodge the junk involved in browsing the remote machine - that is, picking files up and dropping them onto a machine name in My Network Places. If, instead, you stick with the same kind of drive connection as was in vogue in 1987 (never mind 1995), your copies - and other file accesses - will go a good deal faster. And that connection method is, of course, made by mappin
g a drive letter. Now there are a few ways to map a drive letter:
**1) Pick 'Map network drive' from the Context menu of the Network share while you're looking at it in Explorer.
2) Type 'net use f: \\server\share' into a command-line window.
3) Deploy the Net Use command in a login script.
**In the majority of smaller networks, the first method is fine, but I do mean small. With more than three PCs, you should be thinking that one is going to become the 'server' so that the others all
connect to that one. Much beyond three users and everyone from Steve Ballmer to Steve Cassidy wants you to have a real server OS at the heart of the show, and hence you'll need to keep track of who's sharing what.
**The good offices of the 'Net Use' command can sustain you for a few more users, but by the time you're out of the 20s and heading for the 30s things will start to get confusing in an insidious way - your time leaks away looking after the structures that will inevitably arise o
nce you have a gaggle of users to look after. Sure, you can have a different login script for each user. Sure, you can put users into security groups. But consider what happens when you add or remove a server. How can you work out which users were connected to it, and via what combination of drive mappings? Suppose you're splitting an old server in two (preferable in most small operations to just taking out small server A and bringing in big server B), and want to take a whole partition of
f it. Which network shares are mapped to what, and by which user login scripts?
**It hurts your brain working this stuff out, mainly because there's no easy way in the standard login script to express a conditional branch. Nor is there an easy way to work out what security group someone belongs to. Back in 1987, I could express this trivially within NetWare by using an 'If member of...' construct in my login script. It's for this one little bit of syntax, even if for no other feature, th
at KiXtart is so useful. Given a few bits of such conditional logic, you can take all your user logins and consolidate them into a single script file which, being a chunk of vanilla plain text, you can print out, annotate, debug and see clearly what it's doing.
**KiXtarting the network
*In the case of my current problem, another KiXtart feature has come to the rescue. As I mentioned, on this Citrix installation that I'm maintaining, there's an older server and a newer one. The old server w
as built so that the Citrix host drives are named C, D and so on, while the new server has taken advantage of the feature that shows local client PC drives within a Citrix session as C and D and the server's own drives as M, N and O.
**In the medium-scale environment where these two servers are installed, this is just about acceptable at a technical level, but regrettably it confuses the regular users. What saves my bacon in this environment is that KiXtart lets me write 'If @WRKSTA="CSERVER" then ...', enabling the login script to retrieve the name of the machine it's running on; in the case of the Citrix server, I want to radically simplify the range of drive letters that people see when logging in locally.
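**Put together, a consolidated login script of the kind being described might look something like the fragment below. This is an illustrative sketch only - the server, share and group names are invented, and note that current KiXtart documentation spells the workstation macro @WKSTA and omits the 'then', slightly differently from the form quoted above.

; One central login script for everybody - plain text you can print and annotate.

; Everyone gets the standard mappings.
Use H: "\\server1\home"
Use S: "\\server1\shared"

; Only members of the Accounts group see the finance share.
If InGroup("Accounts")
    Use P: "\\server1\finance"
EndIf

; When the script runs on the Citrix box itself, keep the drive map simple.
If @WKSTA = "CSERVER"
    Use S: /DELETE
EndIf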
**The range of options provided when playing with KiXtart has matured slowly since 1995, but maturing is definitely the right word. Apart from the central resource at <A HREF="http://www.KiXtart.org" target="_BLANK">www.KiXtart.org</A>, which is where you go to
get the main executable and enough information to get you going, there's now <A HREF="http://www.kixscripts.com" target="_BLANK">www.kixscripts.com</A>, which offers a paid-for commercial script editor and command reference environment.
**This is a surprisingly necessary aid to have once you start playing with one large, central login script. Even though it's possible to move this login script out of action quickly by simply renaming the file, it can rapidly run out of steam as a means
of protection against cocking up the latest addition to your system configuration. It only takes a couple of Domain Controllers before you start chasing the scripts hither and yon, and as Windows 2000 fully penetrates the market it will soon become vital to have as many Domain Controllers as your network will permit. This means the facilities within the KiXscripts editor for checking the consequences of a script modification have quickly become an important feature.
**If there's one
area where KiXtart never did perform sensibly, it would be when working with the Registry. An astonishing range of workstation options can be set in several places that don't pop straight out at you without a good deal of looking around. The Group Policy editor is one glaring example, as is the behaviour specified in parts of the Registry relating to both what users can do and what the machine will do in response to that user.
**However, when playing with Registry data, KiXtart stops lo
oking like the batch login language we've all been using for nearly two decades and starts to look a lot more like the Windows Scripting Host (WSH) - an offshoot of Visual Basic. You can deal with most of the drive mapping and user access stuff in a few lines of KiXtart. However, if you decide to save yourself the cost of an expensive user-sensitive web proxy that bars access to the Net according to username and instead use Registry keys to tell specific users they can use a simpler proxy, your login script will suddenly bloat from a few lines to a few tens of lines, which have to manipulate Registry keys with painfully long names and an even more painful, wilfully perverse syntax. Is that 'enable the disablement of the proxy' or the other way around? Search me.
**I've avoided considering this option within my login scripts up to now, for several reasons. One is that using such VB-alikes is manifestly programmers' territory. While I don't mind knocking out a two- or three-
page batch script to log into a diverse and complicated network, I draw the line at ten-pagers, eight of which are full of includes, structure definitions, commentary, and long calls to Windows API primitives.
**Another reason is that it's astonishing what will break an executable with so many referring pieces of code. I've encountered large companies in which WSH is now banned, because some virus or Trojan several years ago used an exploit long closed up by a security patch or virus sca
nner. Their IT guys still associate the scripting host with all kinds of Bad Stuff, a wholly typical - technically incorrect but organisationally immovable - reaction that's becoming increasingly common in large outsourced networks.
**And third, there's a longevity issue: WSH has been around for about half as long as KiXtart and, despite the weird disparity in their apparent commercial statuses, I know where I'd rather drop a technical support question. As so often happens given the way
Windows is put together, it seems peculiar that a tool this useful should blossom not quite in the commercial arena, and not quite in the shareware. Microsoft doesn't quite compete with it (in the shape of WSH), while at the same time offering a massively overcomplicated structure like Active Directory that deals with a load of highfalutin' structural approaches to user authentication without going the final three steps to attach a login script directly to each user's own records.
In NetWare ten years ago, a single browsing environment enabled you to look at all the parts of your user's setup (groups, scripts and so on), but in the Microsoft world it's a matter of following up a half-mentioned link on the Resource Kit CD, taking the decision to turn your face away from all the official communiqu
és and then searching the Web for an updated version of KiXtart. Is this what product maturity should look and feel like?
Steve Cassidy questions product maturity when he's resigned to using old tools for new jobs.
July 2003
105 (July 2003)
Networks
Tom Arah
Table manners
It's a hard life for the web designer. Print-based designers can precisely set every feature of a page layout right down to the nearest thousandth of a point, but the absolute precision of PostScript is merely a dream for web designers. Instead, they must work with the vagaries of browser-based HTML interpretation on a seemingly endless range of screen sizes.
**Despair not, though, as it is possible to take control of your web page layouts. You can start by appreciating that HTML's fluid
Ranting aside, Usenet can still be useful, but you need the right access method and some measure of content control. That being so, what newsreader clients would I recommend these days? I've been dreading this question, because the truth is I still recommend the same trio I'd have done several years ago: Agent, Ameol and Outlook Express.
**Blasphemy perhaps, but Outlook Express really is a decent-enough newsreader, provided you keep up with security patches and can't be bothered to downl
oad anything better (and, believe me, a lot of folk can't). I recommend making the effort, though, and for my own needs the best newsreader of the three is Ameol, the standard offline reader for the Cix conferencing system (<A HREF="http://www.cix.co.uk" target="_BLANK">www.cix.co.uk</A>). Ameol provides both dial-up and Telnet access to Cix and stores all message bases and email locally. I keep all my conferencing activity and email within Ameol, and the few dozen technical newsgroups I subscribe to are now stored there as well. I like this integrated storage, because for me Usenet is about conversational flow rather than the downloading of pornographic binaries and so its natural place is alongside my other conversationally oriented activities. Being primarily a conferencing tool, Ameol does a fine job of threading messages, and I can prune and manage my newsgroup bases at the same time as everything else.
**If you're not a Cixen and don't want to become one (and s
hare the debates and information exchanges in our very own PC Pro conferences), the remaining news client choice has to be Agent from Forté
(<A HREF="http://www.forteinc.com" target="_BLANK">www.forteinc.com</A>). Don't confuse this with Free Agent, its cut-down and, as the name suggests, free little brother. I won't bore you here with a long comparison list - visit the Forté
website to read about all the differences - but be advised that the investment of $30 (
£20) in the commercial versi
on will quickly repay itself, as it's a highly functional client. Don't expect a modern-looking interface, though, as Agent has changed little over the years - it's best to take the 30-day trial before parting with your cash.
**There are a couple of other options that would best suit an occasional Usenetter rather than the committed conversationalist; namely, Opera 7 (<A HREF="http://www.opera.com" target="_BLANK">www.opera.com</A>) - which has an integrated mail and newsreader that's mo
re than good enough for low-volume users - and my preferred alternative, Google.
**You heard right, Google. To be more precise, Google Groups (<A HREF="http://groups.google.com" target="_BLANK">http://groups.google.com</A>), formerly the highly respected D
éjà Vu Usenet archive, which Google bought, tweaked and relaunched as a web-based archive of millions of newsgroup postings going back a decade or more. Free to use, fast and simple, you couldn't ask for much more. You'll need to regist
er before posting messages, but you can read as many as you like as a guest. Use it like any other Google engine via keyword searches: enter a topic and you get back a list of postings that match. In essence, it filters out the rubbish for you and presents just those postings you're interested in. Unless, that is, your interest lies in images, movies or application code, as Google Groups is a no-binary zone.
**The Proxomitron
*When it comes to my online life, I'm something of a control fr
eak (probably compensating for the fact that Madame keeps me firmly in my place in domestic matters). I like to be the one who says what my browser should and shouldn't display rather than some Laurence Llewelyn-Bowen of a web designer or a marketer who's been let loose with an HTML editor and an inflated budget.
*That's why I've collected various utilities over the years that enable me to do everything from removing corporate branding to filtering out adult material when one of my kids
is online. And if there's some particular irritant that I can't find an off-the-shelf cure for, I'm not averse to rolling up my sleeves and digging around in the Registry or the source code to sort it.
**The trouble with this approach is that you end up with lots of tools for different jobs, and hence lots of learning curves to master. There are some all-in-one solutions claiming to be magic bullets that shoot down all your troubles, but unfortunately more often than not these prove to b
e just extra problems in disguise. So when all of a sudden everyone I respect in this industry was telling me about a product they can't cope without, I decided it was time to get to grips with it even though it has been around for some time.
**To be honest, I've never seen so much agreement among hard-core techies, journalists and consultants - it's spookily reminiscent of Invasion of the Body Snatchers, which isn't a bad comparison given the science-fiction name of this wonder product
, The Proxomitron, and the unreal way it does just about anything you ask of it. Pop along to <A HREF="http://www.proxomitron.org" target="_BLANK">www.proxomitron.org</A> and download the latest version, and then you'll be able to take control of your online life too.
**You don't see many applications like The Proxomitron nowadays - even the installation routine is good, clean fun. No files get copied to folders outside of its own, no Registry changes are made, and no ini files are modifi
ed. Heck, you don't even need the installer, as it's as easy to dump the unzipped files wherever you like and run them. If you need to move them later, you simply do it, and uninstallation is naturally just as painless.
**You've probably guessed from the function hinted at by its clever name that The Proxomitron runs as a local proxy server - a middleman sitting between your browser and web servers out in the great beyond. All page requests go and come back through it, which means The Pr
oxomitron needs to be running if you want your browser to do anything other than look pretty (and empty). It works well with existing proxies and can help you compile and maintain a list of multiple proxies, between which you can switch at will.
**Once The Proxomitron is installed and running, you can start to have fun tweaking it, which involves firing up one of its two configuration boxes. The first is the Web Page Filter dialog, which contains most of the toggle switches you'd expect
- such as banner and pop-up blockers - but also plenty you wouldn't, such as Flash animation killers, forcing pop-ups to display proper browser controls and preventing automatic browser resizes.
**One of my favourites is the re-enabling of right-mouse clicks on those sites that decide to forbid it because they think it will stop you cutting, copying or pasting (they've obviously never heard of screen-capture apps like HyperSnap-DX or SnagIt). Where things get even more exciting for the
fatally obsessed hacker is when you click on the New button and discover the ability to build your own filters.
**However, The Proxomitron has more tricks up its capacious sleeves, as there's a second filter dialog box devoted to HTTP headers, those bits that get sent in each handshake between client and server that most users are either unaware of or don't care about. They should care, because it's here you can easily find your privacy frittered away.
**Maybe that's too strong, as the
majority of websites don't do any cloak-and-dagger work behind your back, but it doesn't hurt to know what's really going on, does it? The Proxomitron not only lets you see these hidden headers, but also edit or delete them, with most of its options again just a checkbox (or two) away. This dialog box has toggles for both inward and outbound header actions, and its default selections include the ability to 'fake' or kill an outward cookie, to hide referrer details, and to toggle between a
selection of browser client IDs such as Netscape, Opera or Lynx (and, of course, you can add your own options to this list at will).
**The whole Proxomitron thing has a feel of those good old days of online community: while it isn't open source, it's free, nothing is crippled and there are certainly no nag screens (and even if there were, you get the feeling there would be options to toggle them). The author simply asks for anyone interested enough to send him music by the Japanese trio
Shonen Knife, who he seems to really like.
**A lot of thought went into the development of this application, and the author admits that he wanted not only to cure the web irritations of the time (remember the Netscape-induced blinking text nightmare) but to future-proof it so that it could cope with anything the Web would throw at it in years to come. None of its default filters are hard-coded, so you can edit them to your own needs and adapt them whenever the Web fights back. A whole c
ommunity of Proxomitron users has built up, and a quick search will soon point you in the direction of numerous additional filters and tweaks, strewn across the Web.
**The Proxomitron is that rarest of beasts: a free, usable, effective tool that not only does everything it promises but pretty much anything else you can throw at it too. I'm uneasy suggesting you should make a habit of changing every web page you don't like the look of, but it's quite possible to do so once you've got to g
rips with this software - you can switch off frames, 'de-table' a page, turn confirm action boxes off, change JavaScript code, turn off style sheets and so on.
**Of course, once your obsession builds to that degree, it's beyond control and has become something a little more sinister. One can't help but think that if you hate web pages so much, why not throw away your browser or visit only pages you designed yourself. But used in moderation, you won't find a more effective and useful tool
on the Web, and you'll join me and my friends in body-snatched bliss.
**Problems, Problems...
*It seems the saga of the Net Send/RPC spam is set to run and run - or perhaps that should be 'pop up' - all over the place. Despite myself and just about every other RWC contributor having banged on about this for months, my mailbox is still awash with complaints, concerns and requests for a cure. As has been said time and time again, the cure is simply to ensure your firewall closes port 135.
**Although closing down the messenger service will cure the problem (and I've mentioned that here before), it isn't a grown-up solution and should only be thought of as either a knee-jerk first response or a last-ditch effort. Most of the utilities that say they can cure the messenger spam just close down the service and so are a waste of time and money.
**To see if you're vulnerable - as if you wouldn't already know by the volume of this rubbish that appears - pop over to <A HREF="htt
p://www.mynetwatchman.com/winpopup.asp" target="_BLANK">www.mynetwatchman.com/winpopup.asp</A>, where a quick test can determine your position. I also thought that the hard sell at <A HREF="http://www.directadvertiser.com" target="_BLANK">www.directadvertiser.com</A> should be essential browsing for anyone who doubts the scale of the problem - it truly makes for sombre reading.
**Another problem that seems to be bugging readers, clients and my own family is the old chestnut of Microsoft a
pparently not being able to make its browser print properly formatted pages. This has been going on for longer than I care to recall, but you'll still find a large percentage of pages that you try to print will chop off an inch or so of content towards the right margin. The problem is such that I always preview before printing and, if the problem appears, deal with it before hitting the OK button.
**The answer's simple, if somewhat annoying, and is admittedly a bodge - but hey, it works,
so I'm not complaining. Just set any rogue pages to print landscape rather than portrait and there'll be enough room for those wandering margins.
**The alternative is to use a different browser, and PC Pro has reviewed plenty in the last six months. Unfortunately, my favourite, NetCaptor 7.1, uses the IE rendering core so suffers the same problem. Both Mozilla and Opera, funnily enough, manage the print thing with ease. Are you listening, Microsoft? For all our sakes, and to regain a litt
le credibility with my five-year-old son who can't understand why his brilliant computer can't print a page properly, get this sorted in the next Service Pack release.
**Several readers have been in touch since I looked at MailWasher recently, mainly to shake my hand for bringing it to their attention while slapping me with their other hand because it's no longer free and seems to be a different product. I can assure you all, it is the same application, but the branding has changed and no
w comes under the Firetrust banner.
**As for its going commercial, who can blame the developers for finally, after many years, coming to the conclusion that they need a properly structured income stream in order to pursue future product development? Certainly not me, as I'd be more than happy to part with my cash for this application. After all $30, a tad short of
£20, is hardly a king's ransom when you think of the freedom from spam it can bestow.
**And there's evidence that these guy
s mean what they say about development strategies. As I write, I've been in touch with MailWasher's creator and joined the beta programme to help test the forthcoming Content Filtration System (CFS). This will tap into the well-established community of MailWasher users - there have been more than two million downloads of this application - to report spam into a central database. Okay, this isn't a new idea and there are plenty of similar schemes out there already with varying degrees of su
ccess (see Google for more information), but from the little I know so far it does look like this one could work.
**The idea is that once a spam is reported and, most importantly, verified as such, no other MailWasher will ever get one from that particular originator again. The verification process is vital, and CFS won't rely on computer filtration and pinging to do this, but instead will involve a two-person process; namely, the MailWasher user reporting the spam and the CFS administra
tor checking it.
**Firetrust has developed a ranking system to ensure the most common spams are verified first, and using real people should prevent blocking of such things as list servers with which a user is fed up - a toggle switch can allow the admin staff to turn off any user's ability to report spam should they abuse the system. This does seem slightly at odds, though, with a declaration that all reports will be anonymised before being submitted to the database to protect the sende
r from retaliation: I guess such things will get sorted in the wash as it's early days yet.
**I like the idea that spam will get categorised by subject, so it will ultimately be possible to block 'jokes' or 'gland enlargement', but I'm not so keen on this being a chargeable service. With so many free alternatives around, Firetrust is going to have to make this one 100 per cent spam-proof to pick up any client business. Perhaps more saleable will be the server version, which enterprises could find useful in keeping the lid on spam in the workplace. Only time will tell.
After a month laden with problems, Davey Winder is visited by the body snatchers.
July 2003
nature - layouts automatically flow to fill browser windows of any size - is actually very effective for the Web as it was originally conceived; that is, for disseminating scientific papers. However, such crude, single-column layouts are hardly exciting and become virtually unreadable at the line lengths supported by modern high-resolution screens.
**The original layout solution was pioneered back in 1995 when Netscape introduced new tags for dividing the screen into tables, which were qu
ickly adopted by other browser developers and incorporated into the HTML 3.2 specification. HTML tables are tag based so it's possible to create them from scratch in a text editor, but it's much easier to produce them interactively in a visual authoring environment.
**Every HTML-authoring package lets you create tables via an Insert | Table command where you specify the number of rows and columns. On switching to Code or Split view, you can inspect the nested tags generated to create a simple table.
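*For a two-row, three-column table, the generated markup looks something like this (the attribute values shown are just typical defaults, so yours may differ):
*<table width="100%" border="1" cellspacing="2" cellpadding="2">
*<tr>
*<td>&nbsp;</td>
*<td>&nbsp;</td>
*<td>&nbsp;</td>
*</tr>
*<tr>
*<td>&nbsp;</td>
*<td>&nbsp;</td>
*<td>&nbsp;</td>
*</tr>
*</table>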
*The main element is the outer <table> tag, which takes attributes that set the overall width (here 100% to fill the fu
ll width of the browser window); the border (here 1, so you can see the table grid), and the cellspacing and cellpadding that control spacing between cells and cell content.
**Interestingly, there are no <table> attributes for the most fundamental factor - the number of rows and columns. Instead, these are handled incrementally as they're downloaded in keeping with HTML's streaming nature. The opening <tr> tag tells the browser to start a Table Row, followed by any number
of cells enclosed in <td> and </td> Table Data tags (the "&nbsp;" code for a hard space is added to prevent browsers ignoring empty cells). When the browser hits the closing </tr> tag it can render one completed row and start on the next, lining up the new <td> data cells with the previous ones to produce de facto columns.
**This looks straightforward, but it's worth experimenting by creating a multicelled table and randomly scattering varying amounts
of text through it. Within your authoring program, columns start off equally spaced and each cell will grow vertically to accommodate the text as you'd expect, but preview the page in a browser and the results are very different - columns containing more content are wider than those containing less. Resize the browser window and the column widths will shrink and grow, both absolutely and relative to each other (and different browser versions might well produce different results) - bizarre
behaviour that has reduced generations of designers to nervous wrecks.
**What's going on? Remember that <table> tags were invented to handle scientific data, and the browser's function is to show the tabular data on whatever screen size with which it's presented. It has its own ideas on how best that should be done - as Netscape puts it, 'complex heuristics are applied to tables and their cells to attempt to present a pleasing looking table'. The problem for designers is that the
results of this 'smart formatting' are often anything but pleasing, and clearly they need to take tighter control of the sizing of tables. In fact, this is relatively straightforward - at least in principle - as both <table> and <td> tags offer a 'width' attribute.
**We've already seen the width="100%" attribute, which makes a table fill the full browser window. Other percentages allow you to set each <td> cell to be a proportion of the total table width. More sign
ificant still is the ability to set an absolute pixel size, so <td width="500"> will create a cell/column 500 pixels wide irrespective of the size of the browser window. Width isn't absolutely fixed as it would be in a DTP application, though, because cells can still expand if they're too small to contain their content (say, an embedded bitmap image), or if some later cell in the same column has a wider setting.
**Nevertheless, this is a huge step toward the control you need for l
aying out web pages. The first decision is the most important - assuming you want a fixed-width layout, how wide should it be? You're still faced with the fact that screen sizes vary, and if you set a table width larger than the window your users will have to scroll horizontally to read each line (which they won't do for long before fleeing). An obvious solution is to aim at some lowest common denominator, which on desktop PCs is 640 x 480. Then you must consider scroll bars and different
browser setups, again aiming at the lowest common denominator (in this case, Internet Explorer 4.5 on a Mac). Playing it very safe leads to a maximum pixel width of around 580.
**Side-by-side
*But of course, the main advantage of a table is side-by-side text. The classic web layout is a separate column of alternative page links to the left of the main body copy, easily achieved with the Table | Insert Column command that automatically adds extra <td> elements to the existing <
tr> elements (much simpler and safer than hand coding). By default, these <td> cells have no width attribute, but you can change this by interactively dragging on the cell side. Note, though, that in many applications the resulting widths won't add up to the full table width, so it's better to set them manually. A width of 130 pixels for the links column leaves 450 pixels for the main copy column, which might not sound much but at the default browser text size leads to readable
line lengths of around 12 to 15 words.
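*As markup, this two-column arrangement amounts to nothing more than the following sketch (spacing and padding are zeroed here for simplicity, and the border left at 1 so you can still see the grid while you work):
*<table width="580" border="1" cellspacing="0" cellpadding="0">
*<tr>
*<td width="130">Page links</td>
*<td width="450">Main body copy</td>
*</tr>
*</table>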
**This two-column by one-row layout is still pretty basic, but you can quickly add more rows and columns via Table | Insert commands and, crucially, merge adjoining cells using Table | Merge Cells. A glance at the code shows that this is achieved by two new <td> attributes: colspan where a cell straddles multiple columns, and rowspan where it straddles rows. An obvious use is for a table of scientific data where you'd create a first title ro
w spanning the width of the entire table.
**However, that's not the only use for cell spanning. There's nothing to stop you merging cells anywhere in a table grid, which opens up much greater design potential. You can similarly use Table | Split Cell... to further subdivide a table grid, achieved not by any new attribute but by adding extra blank <td> elements, then applying appropriate colspan and rowspan attributes to surrounding cells. Alternatively, if you want to split an exi
sting cell into multiple rows and columns, simply use Insert | Table, repeating this process to create multiple-nested tables.
**Returning to our simple two-column layout, you can add a new row with Table | Insert Row, then merge these two cells to create a single-header cell straddling the whole layout - ideal for a typical web page banner. Then insert another row above the page links and body copy rows to contain links to the main site sections. Finally, merge these cells and insert a n
ew one-row by six-column table into which you can enter the links.
**Using these few commands, you've created the basic banner/ section links/page links/body copy framework that you see on so many websites and, more to the point, you've learned all that's necessary to create far more advanced page layouts.
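*Put together, the skeleton of that framework looks something like this (plain-text placeholders stand in for the real banner, links and copy):
*<table width="580" border="1" cellspacing="0" cellpadding="0">
*<tr>
*<td colspan="2">Banner</td>
*</tr>
*<tr>
*<td colspan="2">Section links (nested one-row, six-column table)</td>
*</tr>
*<tr>
*<td width="130">Page links</td>
*<td width="450">Body copy</td>
*</tr>
*</table>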
**For your home page, for example, you could subdivide the main body copy cell into a grid into which you drop taster stories and links. Or you could increase the width of your layout
by adding a new column down the right, which means targeting larger 800 x 600 displays. However, if you make sure that no column is more than 580 pixels wide, users of smaller displays will only have to use the horizontal scroll bar once to set up their reading of the main copy. There's one final thing you need to do: set all <table> elements' border attributes to zero to hide the grid on which each page is built.
*Using these basic tools, you can build any layout so long as it's
based on non-overlapping rectangular components. You've gained pixel-level control of your design, at least horizontally, and you can achieve similar vertical control thanks to the <td> element's 'height' attribute (again with the proviso that the cell will always expand to contain its content). This is almost DTP-level control, so the obvious question is: why can't we have a DTP-style implementation?
**In fact, this is exactly what wysiwyg authoring packages like NetObjects Fusio
n have always offered: drag text and picture blocks onto the page and, behind the scenes, Fusion generates correctly sized <table> tags to reproduce that layout in a browser. Rather more advanced is the system pioneered by Adobe's GoLive, and then picked up by Dreamweaver with its Table Layout view, where you drag a Layout Table container and then, within its area, drag into position and size Layout Cells to hold the content. As you do this, you can see the outlines of the complex
colspanned and rowspanned table grid that will produce the final desired results.
**We've come a long way from those original browser-interpreted columns of scientific data. Tables are the secret behind the layout of almost every web page, but before getting too pleased with ourselves let's take a reality check by looking at the code. It isn't a pretty sight. For example, the HTML for a typical grid layout produced in the Table Layout view consists of reams of <table> tag code, wi
th vast numbers of empty <td> cells outnumbering a few tiny oases of content.
**This isn't exactly how Tim Berners-Lee imagined the semantic Web, but so what if the code is complex - if it does the job, does it matter? Well yes, and for one overriding reason: efficiency. For a start, complex table code affects efficiency in terms of editability - the more advanced the table layout, the more difficult it is to edit. Table cells aren't actually self-contained DTP-style text and pic
ture boxes, so every time you want to move or resize one it will have repercussions on the layout as a whole. Even minor adjustments can be a nightmare (and you can forget about easy repurposing).
**Under the table
*However, the complexity of HTML table grids is even worse news when it comes to rendering. We've already seen that, unless pinned down, different browsers will interpret the same tags differently, but with more complexity there's more room for disagreement - like how to handle
mixed pixel and percentage attributes, or column widths that don't add up to the table width - and hence more danger that the layout will break.
**On top of which, remember that even the best table layout is never absolutely fixed. All it takes is for the end user to change their default text size so that a cell has to expand to contain its content and your carefully constructed table can fall apart like a pack of cards. Things are certainly better than they were, but it's still necessa
ry to check your table layouts against the most common viewing setups to ensure there aren't any nasty surprises lurking.
**The real problem with tables is even more fundamental than this: browsers were designed to efficiently display tagged text, but these days, because of tables, many web pages contain more tags than text. In other words, the browser's primary task has come to be rendering the table rather than its content.
*That's bad enough, but the way HTML tables render incremental
ly row-by-row as the code downloads proves to be fundamentally flawed in highly designed documents. The system works as intended for regular, fixed-column tabular data, but in advanced layouts the browser might well find that later information - a new column width (whether specified or content dictated) or even an entirely new column - can retrospectively affect everything that's already been rendered. In Netscape, you often see this as disconcerting on-screen re-juggling, while in Explore
r table rendering is usually delayed until every <td> in every last nested <table> has been parsed.
**Suddenly, HTML tables appear in a far less attractive light. Rather than being saviours, <td> tags are more like unwelcome parasites overwhelming their host with reams of inefficient and unreliable code that flouts HTML's original principle of streamlined, flexible and fast rendering. Surely, there has to be a better way?
**There is and it's called Absolute Posit
ioning, which shifts web page layout control out of HTML and onto the dedicated CSS2 (Cascading Style Sheets Level 2) in much the same way that text formatting is best handled by CSS1. Absolute Positioning is definitely the way of the future (and a future column), but until reliable browser support arrives and a critical mass is reached HTML tables will remain the layout mechanism of choice for the majority of web pages. So the obvious question becomes: what can you do to optimise their use?
**The secret is to look at your tables from the browser application's point of view. To render the table as efficiently and reliably as possible, the browser needs to know as much information as soon as it can. That means specifying all attributes like cellpadding and cellspacing so that there's no scope for a varying interpretation. You also need to specify column widths once and for all and as early as possible, preferably in the first row, and to make sure you don't mix percentages
and pixels and that your figures add up. And you need to be certain the content you intend to put into your cells will fit.
**You can also help the browser by adding HTML 4's <col> and <colgroup> elements to specify the number of columns and their width upfront, although authoring and rendering support is currently patchy. If you've set up your table and column widths correctly, you can use CSS2's {table-layout: fixed} property to switch supporting browsers to faster incr
emental rendering. Adding information in this way helps the browser render your table, but the best approach is still to keep it simple. Most importantly, you should avoid the code bloat involved in mimicking wysiwyg DTP-style content box layouts.
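*Applied to the simple two-column layout, those upfront declarations would look something like this sketch:
*<table width="580" border="0" cellspacing="0" cellpadding="0" style="table-layout: fixed">
*<colgroup>
*<col width="130">
*<col width="450">
*</colgroup>
*<tr>
*<td>Page links</td>
*<td>Body copy</td>
*</tr>
*</table>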
**This principle can be taken further: by their nature, features such as rowspans and nested tables delay rendering, so the more you break up your tables into simple incremental units the better. With that classic banner/site links/page links/bo
dy copy layout, for example, you could actually treat each cell as a separate table (you'd need to left-align the page links table so that the two tables appear side-by-side). This means each page element can render independently as it's downloaded rather than being tied to all the others. The actual rendering time for the page as a whole will be roughly the same, but the fact that some content appears quickly can make the psychological difference between a text-heavy page being read or ab
andoned.
**HTML-based tables have a huge amount to offer the designer (try imagining the Web without them), but it's important to recognise that they're a double-edged sword. Ultimately, the most important tool when it comes to designing your web page layouts isn't your powerful-but-profligate authoring package but a pencil and paper, with which you can hone down your design to its bare but elegant essentials.
Are HTML tables God's gift to web designers or the work of the devil? Tom Arah plots a middle way.
July 2003
105 (July 2003)i
Publishing/graphics
Simon Brock and Ian WrigleyD
Track requests for free
One of the more bizarre criticisms levelled at open-source software is that it isn't very useful. The implication is that while you can get free open-source operating systems, programming tools and web servers, the applications that run on them still tend to be paid for. While there's certainly some truth in this - we write paid-for applications ourselves - there are open-source applications that solve real problems, like the one we're going to look at this month. It's a ticket system for tracking requests, known simply as RT.
**The idea behind RT is simple. Imagine you're part of an organisation where people constantly request other people to do something - clients asking questions of suppliers, for example. Assume for the moment that the requestor sends this query as an email. RT handles the email and allocates a number to the request. It then replies to the requestor telling them their request has been received and its allocated reference number. The request is
forwarded to the people in the organisation who might deal with it and RT copies their replies to each other and to the requestor.
**All of this could be done using email aliases, but RT stores these emails in a database that can be accessed via a web interface, enabling the people in the organisation who administer requests to see all the correspondence about each request and, if necessary, change their status. While a request is being worked on, it can be marked as 'open' and, when it'
s been finished, marked as 'resolved'. Once a request is resolved, an email is sent to the requestor so they know their request has been dealt with.
**By now, you'll have formed one of three opinions: you think RT could be useful; you don't think it could be useful; or you think it smacks of the worst aspects of being held in a call-centre queue while you wait to complain that the PC that you've just had delivered doesn't work.
*We're obviously of the first opinion, as RT forms part of t
he daily working life of our company. We use it every day to track requests from our clients, and also to monitor our systems. Whenever a machine is running low on disk space, it opens a ticket that will get resolved by someone when some extra space has been made available.
**Installation
*If you've read this far, you'll probably want to install RT, which you can download from <A HREF="http://www.bestpractical.com" target="_BLANK">www.bestpractical.com</A>. To install the software, you'll
need to carefully prepare your system - RT is written completely in Perl but you'll need more than just a Perl interpreter to make it work.
**First, you need a bunch of Perl modules, but the software comes with a script that tells you what those modules are and helps you download and install them all. Second, you require an SQL database, but RT supports those open-source favourites MySQL and Postgres and you should be able to find one of them.
**Last, you need a web server that's optim
ised for running Perl scripts, and the most popular way of doing this is to incorporate a Perl module into your web server. If you're using Apache, this is best accomplished with mod_perl, which is also available for other servers.
**For those of you unfamiliar with this facility, it builds a Perl interpreter into the web server. So, whenever a request comes in that would normally run a Perl script, the script is run within the server process itself rather than a new process being spawne
d, which then runs the Perl script. This provides a sizeable performance increase when running Perl scripts on websites, but the downside is that a server including mod_perl tends to use a lot more memory.
**Economies can be made by not building the module into the server itself (see Adding modules to Apache), but beware that using a 'dynamically loading' mod_perl module can induce unreliability, although we've used it without any problems.
**Other possibilities are available, such as t
he fast_cgi system (see <A HREF="http://www.fastcgi.com" target="_BLANK">www.fastcgi.com</A>), but you'll probably find that mod_perl is a simpler solution. Once you have your web server set up to run the application, you just need to run a couple of commands to install the software.
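**On a typical Unix box, the install boils down to something like the following sequence (the exact make targets can vary between RT versions, so check the README that comes with your download):
*./configure
*make testdeps
*make fixdeps
*make install
*make initialize-database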
**The basic idea of this division into two groups is that the former do the actual work, while the latter check that the former are doing the work. You need to introduce RT to these users and give them userna
mes and passwords to access the web interface.
**A complex system of rights and controls can be associated with each queue and each user - which are well beyond the scope of this article - but it's best to keep things simple. Just ensure that the group 'Everyone' has the ability to create tickets or the next step won't work.
**Once you've completed the above configuration, you should have a system by which users can log in and create tickets. These tickets can then be allocated to peopl
e to work on, and they can exchange messages with each other and the requestor(s). It should be noted that once a ticket is allocated to a queue, two types of correspondence can be entered against it. The first is a reply that's sent to everyone associated with that queue, while the second is a comment that's only sent to the administrators of the queue.
**The latter is important, because it allows the following style of interaction. Someone sends in a request and that request is taken b
y one person, who replies to the requestor and says they're working on their request. They then decide they need some help, so they comment on the ticket. This comment is then sent to all the administrators and they can comment back. Then, when the person working on the ticket finishes, they can send a reply and resolve the ticket.
**This distinction between internal comment and external reply is important and allows a ticket to include helpful comments for internal use. Of course, it's
also open to abuse, allowing comments of a 'personal' nature to be included on the ticket. Personal experience suggests that this should be discouraged, as occasionally someone forgets the difference between an internal comment and an external reply.
**At this stage, you need to link your RT system to your email package. RT is based around its website, but the incoming email interface is important too.
**First, it allows incoming requests that are made by email to be automatically ent
ered into the system: whenever a request arrives, a message is sent to all the watchers on the queue. By linking the system into the email, these watchers can reply to these messages, which will be automatically entered onto the ticket and sent to the appropriate people. To implement the link to the email system, you need to set up special aliases on the system that runs RT.
**To understand how this works, let's assume you have a queue called 'ClientA Requests', which is where you want to
handle requests from client A. You want to give that client an email address they can use to send in requests, which would typically be something like clienta.requests@yourdomain.co.uk.
**What you need to happen is that whenever email is sent to this address, a script is run that interacts with RT to create a new ticket. Usually, the best way to do this is to ensure email is delivered to the machine that's running RT. If this isn't the case, you have to start from your main mailer a
nd set up aliases that deliver mail to that machine. Assuming this machine is also a Unix machine, most mailers have an alias file that maps a name to an email address. Typically, you need to add a line that looks like this:
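*The exact form depends on your mailer but, with a sendmail-style aliases file and assuming the RT box answers to rt.yourdomain.co.uk (substitute your real hostname), it would be something like:
*clienta.requests: clienta.requests@rt.yourdomain.co.uk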
*This delivers all email for that address to the RT server. Obviously, the RT machine also requires a working mailer, but we'll leave that as an easy exercise for the reader. On the machine running RT
we must also set up an alias, which will typically look like this:
*clienta.requests: "| /path-to/rt3/bin/rt-mailgate -queue 'client A Requests' -action correspond"
*Note how quotes wrap the whole command, so that the queue name, spaces and all, stays part of the alias. Generally, you can set up two aliases for each queue - the one above is for original requests and replies, while a second can be set up for comments.
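*The comment alias follows the same pattern, switching only the action (the alias name and path here are just examples):
*clienta.requests-comment: "| /path-to/rt3/bin/rt-mailgate -queue 'client A Requests' -action comment"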
**Once you have RT set up, it's ready for use and requires little maintenance. This isn't to
say it doesn't need to be managed, because if people don't resolve their tickets, queues can soon fill up.
*It's also possible to change the way the system works when events occur. For example, when a ticket is resolved, we send a response that contains the complete history of the ticket, showing what was done when. It's also possible to set up a system by which requestors get access to the website in a kiosk mode, where they can see their own tickets.
**Once you've got RT working, it's w
orth having a look at an add-on called RTFM. The aim of this software is to allow you to collect together snippets of information from various sources and edit them to produce documents. It can be used with RT by extracting contents from tickets to form these documents.
**If you have any problems, there are mailing lists available plus commercial support from the author of the software.
**So there you have it - a real and definitely usable open-source application. If your company has t
o handle requests from various sources, there's no better or cheaper solution.
Simon Brock and Ian Wrigley sniff out a useful open-source application.
July 2003
105 (July 2003)i
Mark Newton and Paul OckendenD
FrontPage 2003
Last month, we promised to examine FrontPage 2003 in a bit more detail and, more particularly, its claim that it can be used as a graphical XML template builder. But before focusing on this specific area, what's FrontPage 2003 like overall?
**First impressions are that its developers have been spending their holidays at Macromedia, as the product bristles with similarities to Dreamweaver. For instance, its design area shows Display (rendered) and Code views simultaneously; table design is
a simple draw-and-drag operation (even the icons look similar); and there's a tag inspector. Such similarities are no bad thing, though, imitation being the sincerest form of flattery.
**We wondered whether Microsoft had fixed some of the design faults found in previous versions of FrontPage, the most annoying being the ease with which great chunks of code could be inadvertently removed when in Design view. Has Microsoft found some way to show where code is hidden in the Display mode of
editing? No such luck - it's still possible to delete acres of code without any indication or warning. Other products use a small icon to show where there's code present behind some visual element, but the only way to get a similar warning in FrontPage is to switch 'show tags' on. This shows all the HTML tags in Display mode and is hence unusable, as it destroys the ability to see what the page will look like. If you need this much detail, use Code view and learn to read HTML.
**When ty
ping ASP code into the code panel, it would be reasonable to expect some form of code hinting as in Visual Studio and Macromedia's Dreamweaver, but no, all code is shown in one colour without any 'Auto Complete' hints to help with the syntax.
**FrontPage still relies heavily on web bots to achieve many dynamic functions like talking to databases. In the past, this meant you had to have FrontPage Extensions installed on your web server to use the bots, but with FrontPage 2003 the action o
f publishing your site puts all the necessary files (previously known as extensions) onto the server for you. The ASP or ASP .NET code still refers to web bots and looks nothing like what you might write yourself to access a data source, but it's a definite step forward from trying to make sure you had the right version of FrontPage Extensions installed on your live web server.
**Nevertheless, much of the functionality of FrontPage 2003 requires even more Microsoft technology installed o
n your web server before it will work, which means that hosting such sites could still be tricky and makes FrontPage more useful for intranet design using Microsoft Office technologies rather than for Internet deployment.
**Adding DHTML effects is tricky with some design packages, and here's an area where FrontPage is rather good. Mouse-over, click, double-click and page-load events can all have effects applied to them by simply selecting from a series of drop-down boxes. The selection of
border effects is particularly good, as you see a preview of how the object will look rather than guessing until you've saved the page and brought up a browser window as in some other packages. Once a DHTML effect is applied, the object is colour-coded in the Design mode so that you know an effect has been applied; to see the effect in action, a click on the Preview tab shows your page as it should look in a browser with all the DHTML effects working. There's even an option to preview in
multiple browsers, displaying your page in any set of the browsers you've installed on your PC at different screen sizes - a nice touch. This all makes adding effects so easy that we feel obliged to advise restraint - as with fancy fonts, overuse will do nothing good for your design.
**Now to the crux. You may have heard that FrontPage can now design XML templates (XSLT) in a graphical way, making it the first such mainstream product to do so. This is exciting stuff, so we tried to do it.
The problem is that all the 'data view' functions in FrontPage 2003 require Microsoft SharePoint 2 to be running on your server. This will apparently be part of the OS under Windows Server 2003, but at the moment, before we can design an XML template to format XML data, according to the Help file we need Windows Server 2003, SharePoint and IIS 6. That's a lot of server-side technology just to create an XML template to transform an XML data file into something more visually pleasing, both
of which are simply text files. It's all very disappointing and not exactly easy. We'll be setting up a Windows 2003 Server with IIS 6 next month and should then be in a position to tell you how FrontPage 2003 integrates with it and what's involved. Bear in mind that the current version of FrontPage 2003 is a late Beta 2 and so some things may be fixed by the time of release: if they are, rest assured we'll be telling you here.
**VorteXML
*While on the subject of XML, VorteXML 2 was launc
hed recently, which simplifies the business of converting legacy data into XML format. VorteXML is published by Datawatch (<A HREF="http://www.datawatch.com" target="_BLANK">www.datawatch.com</a>), the same firm that brought us Monarch. Just to recap, Monarch enables you to design templates that map data fields exported in text form from a legacy database onto the fields of a new database format of choice. Monarch then extracts the data from the text file and imports it into your new datab
ase. It's an extremely useful tool for getting data out of legacy systems, and VorteXML does basically the same job, but instead of importing the data into a database it produces an XML file for you to do with as you please.
**VorteXML comes in two versions, both easy to use. The first, Designer, costs £345 (exc VAT) and is useful for one-off solutions where you just need to do a data conversion from an old system to a new XML one. Mark took a text dump from a database and within a few m
inutes was able to produce a fully validated XML file and even attach a CSS file to it for formatting. The second version, Server, produces XML files from text files as part of an automated process. It also has a full API to facilitate integration into your existing system. This version costs £5,000 (exc VAT) per server and £1,299 (exc VAT) per extra CPU. Like a lot of similar products, a few of you will find this invaluable while the rest will wonder what all the fuss is about. What we ca
n say is that VorteXML is easy to use, has the flexibility to handle most jobs and nicely fills a niche in the XML marketplace. It's an obvious successor to Monarch, creating an XML data source rather than Monarch's database data source.
**e-hairdresser.com?
*Most of us, when pondering the commercial aspect of the Web, find our thoughts instinctively drawn to e-commerce and the potential for selling goods and services from lucrative online stores. There are, however, some businesses that
are unsuited to major e-commerce application; for example, hairdressers, public swimming baths and pubs. That's not to say the Web can't do anything at all for such businesses. It just means they can't take the easy option of flogging their main product or service from their website.
**There might, of course, be some scope to sell ancillary items: the swimming pool could sell season passes or goggles; the pub might sell Firkin T-shirts; and even the hairdresser could find certain fringe b
enefits (ho, ho!). But the main use for all their sites is going to be promotion - they all need to get people through their doors. And to do this, one of the essential things any 'get footfall' site needs is a location map. People are much more likely to show up if they can find your place. Some sites provide click-through links to Streetmap (<A HREF="http://www.streetmap.co.uk" target="_BLANK">www.streetmap.co.uk</a>), Multimap (<A HREF="http://www.multimap.com" target="_BLANK">www.multi
map.com</a>), MapQuest (<A HREF="http://www.mapquest.co.uk" target="_BLANK">www.mapquest.co.uk</a>) or any of the other online map providers.
**The problem is that this isn't very professional and you might also lose your visitor at this point, as these map providers all carry some form of advertising. It's far better to provide ad-free maps embedded within your own pages. You can do this while still using these online mapping companies, as most of them provide a service whereby you can
use their maps on your own pages. You have to pay, of course, but in terms of creating a professional image it's head and shoulders above just linking to the map sites.
**We've used Multimap, which operates on a pay-per-view basis, for some of our own projects, and its system seems to work well. We just call Multimap's map server, including the address for the desired map, and get a nicely formatted graphic returned. It also sends us our monthly statistics by email, showing our map usage
across various domains. If you only have a few locations, though, there may be a cheaper option, and strangely enough it comes courtesy of Microsoft. When the press release for AutoRoute 2003 landed in Paul's mailbox, he noticed that it said 'Maps produced in AutoRoute can be ... published to the Web using the "Save as Web Page" feature', which was intriguing.
**We'd only ever thought of AutoRoute as a 'get me from A to B' product, but now we started wondering whether it could be used to
produce maps for commercial websites. So we checked with Microsoft's press office, enquiring what exactly the licensing restrictions were. The response from Rik Temmink, global product manager for AutoRoute 2003, was as follows:
*'The licence agreement allows a user to "generate and post online" up to 1,000 pieces of content from AutoRoute. So in layman's terms, you can save maps and driving directions in HTML format from AutoRoute, and put 1,000 of these saved maps/directions on your
website. Not 1,000 items at a time or 1,000 items on each website, but 1,000 items over the lifetime of the licence. After you reach the 1,000 limit, the AutoRoute licence is obviously still valid for other tasks, but you'd have to purchase another licence to get another 1,000 "posts" in the bank.'
**There you have it - if your company has fewer than 1,000 branches, the easiest and cheapest way to produce location maps for your website is to nip out to your nearest software shop and pick u
p a copy of AutoRoute 2003, a so-called consumer product. If you have 2,000 branches, just buy two copies! At £40 for the software, the cost of each map is around 4p!
**Those bad link blues
*Paul provided ten of the '100 tips and tricks' that you'll have found in PC Pro's glorious 100th issue, and one of those hints has resulted in quite a few email requests asking for further details, so here goes.
**The tip was: 'Bad links can be serious, so start to spot those 404 errors as soon as t
hey happen. It's better to find out that you've messed up an important link on your front page before you run your monthly logfile analysis. Or before you notice a drop in the order volumes. Or before you notice a drop in the company bank account. Reconfigure your web server to use a special 404 page. Then make that page run a script to send you an email.'
**Now let's look at that in a bit more detail. With most normal web server configurations, if a user tries to access a page that doesn
't exist a normal 404 error page will be displayed. See <A HREF="http://www.ockenden.com/error.html" target="_BLANK">www.ockenden.com/error.html</a> for an example - this page doesn't exist and the response is the standard IIS 'Page not found' error page.
*But if you try the same thing on Paul's company site <A HREF="http://www.cst-group.com/error.html" target="_BLANK">www.cst-group.com/error.html</a>, you'll see the error is handled more gracefully. What you don't see, though, is that t
he page will have also generated an email that went to the site developers (that's Paul and his team), informing them of the error.
**How's that done? Well, it's quite easy. First, you need to configure your web server to use a special 404 page. With Microsoft's IIS, do this via the normal management tools - right-click on your website within the IIS management tool and then select the Custom Errors tab. Scroll down the list to find 404 and change it to an URL pointing at something like:
/errors/404.asp
*If you're running PHP on an Apache server, you must make a change to .htaccess, a file that enables you to override all kinds of Apache server instructions and defaults. In typical Unix fashion, you'll need to be careful with its filename - it has to be all lower case and don't forget that it starts with a dot. The best place to make this change is in the root directory of your site, because a .htaccess file in one directory affects all the subdirectories below it (unless,
of course, they have their own .htaccess file). To set up the custom error page, create a new .htaccess file (or modify an existing one), adding an ErrorDocument line. The format of this is: ErrorDocument errorcode url. So in your case, you'll write something like:
*ErrorDocument 404 /errors/404.php
*Okay, so now that you've got your web server using a special 404 page, how do you get it to email you with details of the bad link? Easy peasy. In PHP, you do something like this:
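*(What follows is only a sketch - the addresses are placeholders, so substitute your own.)
*<?php
*// Details of the failed request
*$servname = $_SERVER['SERVER_NAME'];
*$page = $_SERVER['REQUEST_URI'];
*$ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : 'none';
*$mess = "404 error on $servname\r\n";
*$mess .= "Page: $page\r\n";
*$mess .= "Referrer: $ref\r\n";
*// mail() has no from setting of its own, so it goes in as an extra header
*mail('webmaster@domain.com', '404 error', $mess, 'From: error@domain.com');
*?>
**And the equivalent in ASP, using the CDONTS mail object, is this: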
**<%
*' The variable set-up here is a sketch: IIS tacks the originally
*' requested URL onto the querystring after a semicolon (see below)
*qs = Request.ServerVariables("QUERY_STRING")
*page = Mid(qs, InStr(qs, ";") + 1)
*servname = Request.ServerVariables("SERVER_NAME")
*ref = Request.ServerVariables("HTTP_REFERER")
*mess = "404 error on " & servname & chr(13) & chr(10) _
*& "Page: " & page & chr(13) & chr(10) _
*& "Referrer: " & ref & chr(13) & chr(10)
*Set objMail = CreateObject("CDONTS.NewMail")
*objMail.From = "error@domain.com"
*objMail.To = "webmaster@domain.com"
*objMail.Subject = "404 error"
*objMail.Body = mess
*objMail.Send
*Set objMail = Nothing
*%>
*These are both blocks of code that you can embed in a normal page, and you'll notice that they follow a similar structure, with
a couple of subtle differences. First, the way IIS maps error pages means you can't just query the script name of the current page in order to get the page that was requested, as it will merely return the name of the 404 handler page. Gee, thanks, Microsoft.
**Instead, IIS adds it on to the end of the querystring, following a semicolon. Once you know that, it's fairly easy to retrieve it. Second, note the default mail() function in PHP doesn't allow you to set the from: address directly
. Instead, you have to set it as a 'header', hence the special formatting of the last parameter in the mail() call.
**Once you've set all this up, you'll notice that you get all kinds of 'rubbish' bad link alerts - robots looking for robots.txt files in places where you don't have them, or 'Code Red' and script kiddies trying to hack your servers using the usual list of exploits. You'll probably want to add some conditional code into your 404 page so that such bogus automated 404s don't g
enerate emails. Otherwise, you'll get a very full mailbox.
Mark Newton grabs the FrontPage while Paul Ockenden maps out his future.
July 2003
105 (July 2003)i
Web Business
Jon HoneyballD
Onward and Upward
I'm now utterly convinced that we are about to move into a completely new and exciting era in computing. Sounds like a big claim you might think, but the pieces are definitely slotting into place and there are already credible arrival dates for all the critical technologies. I've been fortunate enough to witness several such seismic shifts before, and each time there's a clear and decisive step-change in the technology and mindset.
**Back in the days of MS-DOS people did plenty of good w
I'm writing this month's column from San Francisco, where the world's IT press has gathered to find out all there is to know about Office 2003 or, rather, all that Microsoft wants to tell us. We've endured two days of wall-to-wall PowerPoint presentations and demonstrations to bring you the latest news.
*Office 2003 beta 2 will be available from 10 March 2003. Subscribers to MSDN should receive it in their next shipment, or you can order a set of CDs for a modest fee. This is, after all, a
ork using applications like WordPerfect and Lotus 123, but the move to 16-bit Windows computing with Windows 3.0/3.1 in 1990 brought many benefits over and above the graphical user interface. Its unification of printer driver models was, for many, a huge benefit, let alone everything else that you got: unified screen drivers; a mouse-driven GUI; and finally some semblance of proper memory management. I know there were many unpleasantnesses hiding under Windows 3.0's memory manager, but it
did allow applications to share memory in a cooperative fashion, so that proper multitasking no longer required a third-party task switcher like DesqView. Most important of all, it preserved and built on the best fundamentals of the DOS era - applications were mature and robust and we understood clearly what we were supposed to do with them. I'll hurriedly skip over the fact that many of the first Windows implementations from well-known application vendors were grotesquely badly designed,
despite several years of clear demonstrations of how to write good Windows applications from products like Excel and PageMaker.
**The next significant era arrived in the mid-90s with the shift on the client side to 32-bit computing and the appearance of robust and serious server technology. Once again this built on the best of the previous generation, and Netware 2.x, 3.x and 4.x were important leaders on the server front. The arrival of the Internet just spurred the need for robust desktop
technologies, and we were firmly into the era of Microsoft's Office suite. This same model has served us well since then, but now it's time to move forward, and the next big shift is going to start this summer and push through over the next two years or so.
**The key technology is XML, the extensible markup language which finally and forever breaks the relationship between information and the way it's presented. You're now probably saying to yourself that XML isn't something new this month, th
at it's been around for some while so why am I only now proclaiming it to be the future of computing? Well, XML by itself is not enough, and nor is the provision of development tools either. For XML to change the world it needs to impact on everyone's day-to-day computing experience, and it's here that the next pieces are falling into place. Before I get too excited though, I should admit that the picture is still not fully complete, but the end-game is clear to see, the clock is ticking a
nd it's time to board the train.
*The Office is Open
*Some months ago I expressed concern about rumours that were circulating regarding Office 11 having a heavy XML emphasis. I was concerned largely because Microsoft has traditionally not been the most open of companies when it comes to new data formats (and that's putting the matter in the kindest possible language). Take a data format like that used by Excel: from the beginning this has been known as BIFF, for Binary Interchange File Fo
rmat. BIFF is a private binary data format and it's not published anywhere. As if that weren't bad enough, with the arrival of structured storage Office documents, all BIFF information was poured into a stream inside a structured storage file, and to get to the structured storage streams you needed to use Microsoft's development APIs, thus making it doubly hard to read the data held in a worksheet. You could get a document from Microsoft that described the BIFF format, but this required si
gning an NDA (non-disclosure agreement) and thus was only made available if you were considered a 'friendly'. Just to add insult to injury, it appears that a lot of the BIFF format is either not documented at all or is somewhat incomplete. The bottom line was clear - Microsoft didn't want you to read BIFF files except from Excel, and the same applied to the document format of Word and the other Office applications.
**Claims from Microsoft that it has now done a full 180 degree U-turn mi
ght rightly be treated with considerable suspicion. The first hints of this came a few months ago when the first beta of Office 11 showed that it could output XML data in a clear and coherent fashion, and that the text didn't appear to contain anything proprietary. After all, there is little point having an open, text-based XML file format if it's littered with large binary strings like <PRIVATE_MS_FORMAT> AB0F111944356FFCDB05566 </PRIVATE_MS_FORMAT>.
**At that time I said that the proof
of the pudding remained in the eating, and that I wouldn't be convinced until it was clearly proved that Microsoft understood the meaning of the term 'open' when it came to document file formats. Well, I'm here again to tell you that I think it really does now. And it comes down to the work of one man, Jean Paoli, an ebullient and passionate exponent of XML, who has been driving Microsoft's XML vision. You might expect this from him - after all his job title is Microsoft XML Architect - b
ut titles are one thing and credibility is another. He was one of the co-creators of the W3C XML 1.0 standard, and when you listen to him speak on the subject, you're left in absolutely no doubt that Microsoft XML data is open and ever shall remain so.
**The extent to which Microsoft has managed to shoe-horn XML support into the forthcoming Office 11 product is something that really surprised me. I can now sit inside Access and output some data as an XML data file and its associated schema, then
load up Word and bring that data straight into it. I can work on a Word document and mark it up with XML schema information, then punt it straight out into Excel. I can take some data from Excel and trundle it out onto a Biztalk server for queuing up into a large-scale workflow system. Or I could use the stunning Xdocs application, now renamed 'InfoPath' and use traditional drag-and-drop formatting and design tools to create whatever sort of data entry form, information browsing or data m
anipulation I might want to create.
**Was it worth 24 hours of flying to San Francisco to listen to Paoli for the afternoon, bubbling with enthusiasm about how Office 11 was the first realisation of his lifelong XML dream? Yes, of course it was. When you talk to people of this calibre and understand the enormity of the volte-face that they've persuaded the Microsoft monolith to make, you cannot help but be intrigued as to what they will come up with next.
*And therein lies another t
ale. You can now take it as read that Microsoft is right up at the bleeding edge of XML work anywhere in the world. Companies like Sun who are battling to make OpenOffice and StarOffice into credible Microsoft Office clones have missed the point. Yet again, Microsoft's opponents have aimed at the wrong target. In the meantime Microsoft has not come up with the answer to 'what is a better Office?' but has chosen to redefine the question.
**Look at the situation now. We have world-class d
evelopment tools for creating web services that sit on servers. These servers and objects can be inside your network, or they can be the other side of the world. We don't care what language you've used, or what platform they're running on - it could be Microsoft's common runtime, it could be Java, or it could be a BBC Micro, it doesn't matter a jot. We've already had one major wave of middle-tier engines in the shape of Biztalk and Transformation Services, and their equivalents from other
companies, and everyone has learned from those experiences and is pushing ahead with next-generation solutions.
**It's intriguing to see how Microsoft has taken a hatchet to its middle-tier engines and ripped them into pieces - Biztalk is no more, nor are the other platforms like Commerce Server and so forth. Everything is now being distilled down into common shared platforms, so there will be one workflow engine, one commerce pipeline, one application transaction server and so forth.
And now we have the front-end tools, in the shape of the Office 11 suite, to deliver a rich and meaningful experience to the desktop. It's now possible to move XML forward onto the desktop and backward to the back-end data stores too, and this is a new encompassing vision that demands serious attention.
*Where's the Store?
*There is however one major component missing, and that is storage. At this point, I have to make a leap of faith and do some speculating. If you think about it hard en
ough, it's a bit daft to store XML in text files called 'mydata.xml' on an NTFS hard disk - what we really need is a 'digital soup' store that just consumes XML data and looks after it. Well, I think it's coming, and sooner than you might think. WinFS (Windows File System), due to arrive with the Longhorn version of desktop Windows next summer, is apparently just that: a native XML store. This timing and release pattern makes huge sense - release it on the client side first, and then foll
ow up in the Blackcomb server release a year later, where the data loads will be far higher.
**Why would you want to store information in a digital soup? Well, the reasons are many and all utterly compelling. Consider how you store a file at the moment: pop a text file called fred.txt into a directory called c:\worddocs. What do you know about this file? Well, you know it contains text by its .TXT extension; you know it's called 'fred'; you know you have put it into that directory. There are a few other important things you know too - such as when the file was created, from its creation date stamp, and when it was last changed. There are also some security properties associated with the file, to allow the right people to access its information.
**Now assume that this text file is in XML format, and that it contains the schema for something like an invoice - inside the file you know the invoice number, the number of items, the customer reference number and so for
th, all of which are properties of the information content, and the filename fred is just another of these properties, as is the directory it lives in. Imagine that you had a store that reads and consumes that XML file for you, much the way a database manager would do nowadays. Imagine that you have no fixed directories in this store either, but that these were just 'stored views' or 'catalogues'. Then you could view the information in any way you want, and you could spin it around and vie
w it in different ways for different purposes.
**Let me give an example to explain this. If I knew how to store invoices from the XML schema, then it would be possible to see c:\company name\month\items sold\ and then see all the entries for products sold. So I might see c:\JonsCompany\January\Toothbrushes and then all the line items sold that month. However you might be interested in product first, and thus have c:\items sold\company name\month\ and thus turn this whole view on its head
. We would both be looking at the same data store, but viewing it from different perspectives. Once XML schema and meta-tags are surfaced into the store, this all becomes possible. You can see a very simple implementation of this today by going to an NTFS format store and looking in Explorer at a pile of Office documents. Right-click on the headings and choose some of the other meta-tags which can be exposed. Now take this mindset and expand it up to a wholly generalised customer-defined s
et of schemas.
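*To make that concrete, here's the sort of invoice file being talked about a moment ago - the element names are invented purely for illustration and aren't taken from any published schema:

<invoice number="INV-0042">
  <customer ref="CUST-117">JonsCompany</customer>
  <date>2003-01-15</date>
  <item sku="TB-01" quantity="200">Toothbrushes</item>
</invoice>

*Once a store understands the schema, the invoice number, customer reference and month become queryable properties in exactly the same way as the filename and directory are today, which is what makes those switchable 'stored views' possible.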
**Will this all arrive with Longhorn? All the indications, plus various nudges and winks and off-the-record chats suggest that it might, but it's clear that going for a unified information store is the last remaining piece of the XML puzzle which needs to be fulfilled. Hence, to come full circle, back to my opening prediction. Once we take today's middle-tier and application development capabilities and wrap in Office 11 as a first-cut version of XML on the desktop, then f
ortify the rear by changing the storage to a native XML store, the transition to the next computing era will have happened.
**Do you grasp just how fantastically exciting this all is? Of course there will be many who just don't see the point, who 'don't get it'. These people will happily stay in the current OS/Office world for some years to come, in just the same way that many stayed with WordPerfect and Lotus 123 well into the GUI era. I think Microsoft knows this clearly enough, but also
knows that they'll be back in a few years once the dust has settled. Being a pioneer is not without its bruises, and looking back over my computing career I can think of times when maybe I pushed the boundaries a bit too far. But that is what I was paid to do and so 'je ne regrette rien'. For those of you who shudder at the thought of having to make this XML world work, rest assured that there's still plenty of short- and medium-term future for the status quo. Just don't be upset if some of u
s hitch the horses to the wagons and head west...
*OneNote
*I'm sure Contributing Editor Simon Jones is going to be detailing many of the new Office 11 features in his own column, but I just have to tell you about OneNote. I've installed this onto my Compaq tablet PC, and it's utterly compelling. Such clarity of thinking, such fantastically straightforward problem solving, I'm completely hooked. Go into a meeting, put the tablet on the table and press the record button. Record the entire
meeting and have all of your typings, jottings and scribbles time-stamped. Go back to your desk, sync the information to your desktop and main servers and open the file up there. Click on the little loudspeaker icon next to any item and it plays the sound recording around the time when you made that entry. There's no object model, no macro language, no XML data store (yet). It's a very simple design in this first release, but still a joy to behold, and it comes from the Word group at Micro
soft, clearly showing how they're prepared to 'think out of the box' when it comes to new ways of working with information. I really hope this doesn't get sidelined by inter-department rivalry. In truth, some child of OneNote ought to become the universal canvas and active surface of your desktop, laptop and PDA in the future.
**A Chair for Windows
*I'll admit this sounds a bit silly, but finding a decent office seat is incredibly difficult. The cheap ones seem to be too flat, too hard,
too slippery, or just plain uncomfortable. That ubiquitous Herman Miller 'Aeron' (an Aertex vest of a chair) smacks too much of dot com madness. Years ago Recaro, the company famous for its car seats for road and race applications, did an office seat called the IdealSeat C which was ferociously expensive, but consisted of a top of the line Recaro car seat mounted onto an office base. I didn't buy one then, pleading poverty, but I've regretted it ever since. Good posture is not optional if
you spend hours in front of a computer. So you can imagine my delight at finding that Recaro has started to make them again, and you can order direct from <A HREF="http://www.recarodirect.co.uk" target="_BLANK">www.recarodirect.co.uk</A>. I chose the fabric model called 'Filum' and it's utterly fantastic - supportive, comfortable, and holds me in place as I swerve around the keyboard at high speed trying to get overdue articles written (sorry, Mr Editor!). Treat yourself - your back will love you forever.
Hoping his crystal ball is in working order, Jon Honeyball looks at what could be the next Big Thing (June 2003)
Never put off until tomorrow what you can do today - a smug, trite homily that overlooks all those serious reasons that may have stopped you doing it, but one that nevertheless still contains a grain of truth. Here's a case that illustrates it perfectly. A Windows 2000 Domain Controller fails: it's a hardware problem and you need to replace the server with a new one. It was the sole server in this network and backups were, let's say, on the scarce side. As a result, a new Windows 2000 Serv
marketing beta whose purpose isn't so much to find and fix bugs, but to get the world ready for the new suite. The completed product should be released to manufacturing (RTM) at the end of May.
*Most of its parts seemed to work, although there was the usual quota of crashes and 'unexpected' results during some of the demos. There are new features for all of the Office applications - some have been changed more than others. Some wholly new applications have been added to the suite, but Mic
rosoft isn't yet saying which versions of the suite will come with which mix of applications.
*Microsoft has concentrated the changes in Office 2003 to favour what it calls the 'Information Worker', which seems to mean the corporate customer. XML support, SharePoint integration and Information Rights Management are all big themes in this version. InfoPath, one of the two new applications, is squarely aimed at corporate customers and it's difficult to imagine any single PC or home user havi
ng much use for it. OneNote, however, is definitely a personal productivity tool and performs best on a Tablet PC with a pen interface.
*No version of Office seems to come without a complete redesign of all the menus and toolbars - I suppose it keeps the third-party tool vendors on their toes, as they'll have to redesign their menus and toolbar tools to stay looking like Office. If you use the standard Windows XP Theme (Luna-Blue), the menus and toolbars will be a pale to medium blue: tool
bars look slightly convex and icons have been redrawn to take advantage of the greater colour depth available on most modern displays. The icon editor, however, seems to be stuck in 16 colours, but perhaps that will change before RTM.
*TaskPanes are also now much in favour. Most of the applications open with a Getting Started TaskPane that offers help and links to open existing or create new documents. All applications share the new Research TaskPane, in which you can search a variety of
reference sources for articles about any topic. Some of these sources are free, such as Encarta World Dictionary, while others such as E-Library must be paid for. E-Library is currently a free trial, but will cost $14.95 (£9) a month or $79.95 (£50) a year - a heck of a price for a collection of magazine articles. Factiva, another paid-for research library, offers business intelligence from Reuters and Dow Jones, but doesn't seem to have published its price list.
*Microsoft promises conten
t providers with local and international relevance, but I remain sceptical. I reckon there'll be a few free services, some international services and no local services at all by launch day. The paid-for services will probably all require you to have a credit card. This isn't much of a problem for UK users, but elsewhere in the world credit card usage is much less widespread.
*Much of the new integration is with SharePoint Services - a grown-up version of SharePoint Team Services that shipp
ed with some versions of Office XP. The new version of SharePoint will ship as an add-on to Windows Server 2003. It's vastly superior to the previous offering, but does need Windows Server 2003 on which to run. SharePoint is such a significant piece of technology that it deserves a column to itself, so you'll have to wait with bated breath until next month.
*A new feature that works in just about every application in the suite is dubbed IRM (Information Rights Management). This allows you
to protect your documents from being read by unauthorised people. Documents and emails that are flagged as protected get encrypted so they can only be read by users with specified email addresses, and these people can be prevented from printing, forwarding or copying the data in that document or email. The protected document or email may also be set to expire on a certain date, after which only the creator of the document can open it.
*This will all sound very good to companies that want
to stop information leaking, but there are nevertheless some significant caveats. IRM depends on having Windows Server 2003 in your organisation, installing an update to the Windows operating system on every client PC and having each client PC 'authenticated' by Microsoft. Users who don't have Office 2003 can only see IRM-protected content if they download and install a plug-in for Internet Explorer.
*There are as yet no IRM clients for Pocket PC or Smartphone devices, so they can't be u
sed to read IRM-protected documents or emails. If you want to send IRM-protected content outside your organisation, you have to give the recipient access to your IRM server, the one running Windows Server 2003. I'll be looking at this feature in depth in a later column, but I'm sure you'll find much more about it in Jon Honeyball's column.
*Two months ago, I wrote extensively about the support for XML in Office 2003, particularly as it will affect developers. From the end-user perspective,
it will be possible to integrate XML data easily into Word, Excel, Access and FrontPage documents. This XML data may come from Web Services, a database query, an InfoPath form or by any other means.
*Word and Excel can now save files in native XML format rather than the current DOC and XLS formats. They can also have custom XML Schemas applied to a document or template, giving that data more meaning and relevance than just cells or paragraphs. Documents based on templates with a custom s
chema may be saved as pure XML with all the formatting staying in the template. Change the template, and the documents based on it will also change.
*You can apply transformations to the XML documents or suck them straight into a database or BizTalk server. Templates with custom XML Schemas can have custom TaskPanes that help the user complete the document, offering context-sensitive help relevant to wherever the insertion point is in the document. XML support, along with many of the othe
r enhancements, is most applicable to corporate users and for many applications there will be little to tempt single or home users to upgrade.
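*To give a flavour of the 'transformations' mentioned above, a short XSLT stylesheet is all it takes to boil a saved XML document down into something else. This is only a generic sketch - the invoice/item elements are invented for illustration and aren't part of any Office schema:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <!-- Reduce a file full of invoices to a one-line-per-invoice summary -->
  <xsl:template match="/invoices">
    <summary>
      <xsl:for-each select="invoice">
        <line ref="{@number}" items="{count(item)}"/>
      </xsl:for-each>
    </summary>
  </xsl:template>
</xsl:stylesheet>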
*Searching the Help text, if you have an Internet connection, will also search Microsoft's new Office Online website, which keeps Help up-to-date and means Microsoft can add new Help to address any problems that come to light after the suite is released. Microsoft has also been busy writing mini-online tutorials to help newcomers to the suite make t
he most of their investment. For instance, one of the tutorials for Excel shows you how Excel deals with time values, how to format them and perform arithmetic with them.
*Word 2003 contains all the enhancements described above and a few more of its own, including Reading view, style lockdown and section protection. Reading view is long overdue: Microsoft has finally realised that many documents are opened and closed again without any changes being made; that is, the user is reading the do
cuments rather than editing them. Reading view shows a single or double-page spread with much less on-screen clutter: margins are reduced, unnecessary toolbars turned off and so on. You can still show the document map, which lists the headings or thumbnails of the pages.
*Reading layout uses ClearType technology to make fonts smoother, and you may increase or decrease the size of the type without changing the font sizes in the document. The layout is very good for reading a document and a
dding comments, but beware that digital ink mark-ups (circling, underlining or drawing arrows pointing to words or phrases) may cause problems. Ink is tied to a paragraph rather than to individual words or characters, so in other views the different margins and word-wrap may change the position of the ink relative to the text.
*You can now lock a document or template so that users are only able to format the document by selecting styles, which stops them from applying arbitrary formatting
of their choosing. You can also protect sections of a document from being changed, or enforce change-tracking. Again, these are good options for corporate users who share documents. The IT department can create document templates to match the corporate identity and the users can't stray from these standards. Some industries have to keep a change-history for certain legal documents and so enforcing change-tracking will suit them very well.
**Excel and PowerPoint
*Excel has probably change
d the least of all the applications in Microsoft Office. Apart from XML Support and an updated UI appearance, there's little obvious benefit for a single or home user. Finding and downloading template workbooks from Microsoft has been made easier and there's a new Lists feature that makes worksheets behave more like a table in a database. The IRM feature and SharePoint integration will benefit corporate users only.
*PowerPoint has had a minimal makeover too, but can now save completed pres
entations to CD-ROM along with an updated viewer - pop the CD into any computer and the presentation starts to play. You can also embed Windows Media Video files into a presentation and have them play full screen.
*The application that has received the most obvious improvements is Outlook. Microsoft says it was aiming for simplicity in this release and that options should now have sensible defaults for those 80 per cent of users who never look inside the Options dialog. Outlook looks radic
ally different from previous incarnations and the changes behind the scenes are legion. On the surface, having the new Reading pane on the right-hand side rather than at the bottom doesn't sound like much of an improvement, but it is. The shorter lines that result are easier to read, you can see more of a message without having to scroll, and the solid grey border and white margin around the text helps your eyes stay focused as you read.
*To accommodate this Reading pane, the list of mess
ages has become thinner and now shows each item on two lines, while the Outlook Bar and Folder List have merged to form the Navigation Bar. This shows icons for the various functions of Outlook such as Mail, Calendar, Tasks and the rest at the bottom - the top section of this Navigation Bar changes according to which function is selected. It took me a while to get accustomed to the Navigation Bar, as I was used to having shortcuts to frequently used folders always accessible in the Outlook
Bar and the full folder list right next to it. Switching between Mail, Calendar, Task and Shortcuts functions on the Navigation Bar seemed cumbersome at first, but now I've been using it for a while I'm finding it more amenable, especially when dealing with multiple calendars or contact folders.
*New to Beta 2 of Outlook is an updated junk mail filter, which has a new filter engine along with lists for trusted senders, trusted recipients and junk mail senders. Microsoft is promising to ke
ep the filter up-to-date, distributing updates as necessary through the Office Update website. The junk mail filter in previous versions was never updated and virtually all junk mail merchants knew what keywords would trigger it and so avoided putting them in their messages. This version still isn't the Holy Grail - that is, a fully Bayesian filter that learns what you think is and isn't spam - but it's certainly better than the previous offering.
*Another new feature I've found useful is
quick flags. You can now apply one of six coloured flags to any mail item with a single click. It's up to you how you use these flags, though it would have been nice to be able to label them with some text rather than just a colour. As before, you can add reminder text and a date/time to get a pop-up reminder later, but you don't have to do this just to get a flag. One of the new search folders constantly looks for flagged items and shows them all together, grouped by flag colour.
*Search
folders themselves are another welcome improvement. Anything you can search for using the Find or Advanced Find functions can be set up as a Search Folder, with the results of that search constantly updated in the background. If you have Exchange Server 2003, these searches can be performed on the server. Out of the box, Outlook comes with search folders for unread messages, flagged messages and large messages. The New Search Folder user interface is easy to use, but you can also use Find
or Advanced Find and save your query as a new search folder.
*If you use Exchange Server, you'll notice some big improvements in connectivity, as you can now set up Outlook to use a cached copy of your Exchange mailbox. This is a single tick-box option rather than the raft of questions you used to have to answer to set up an OST (Offline Storage file) with synchronisation. The cached mailbox gets synchronised automatically whenever Exchange server is available and Outlook doesn't mind if t
he connection to Exchange disappears or reappears, so you no longer have to shut Outlook before unplugging the network connection. The connection to Exchange is now speed sensitive: on a slow link, such as a dial-up connection, all the headers are downloaded first and then the message bodies. You can also set up a connection to Exchange Server 2003 via HTTPS without needing a VPN (Virtual Private Network).
**Small Business Edition
*Destined, I think, for the Small Business Edition of Offi
ce are Publisher and an add-on to Outlook called Business Contact Manager. Publisher has had a good makeover with many new and improved features. For instance, it can now create HTML emails to publicise your business or turn an existing print publication into a collection of pages to be published on your website.
*Business Contact Manager, however, is a miserable thing. It's supposed to be an extension of Outlook contacts that shows the hierarchy of companies and contacts and the history
of all your dealings with each contact. Microsoft claims it's designed for companies with fewer than 25 employees, yet Business Contact Manager is most definitely a single-user application. It can't use Exchange Public Folders, data is stored in MSDE (Microsoft SQL Server Desktop Engine) and the forms it uses aren't customisable in any way. How many companies of five people, let alone 25, have just a single person responsible for contacts with customers?
*Business Contact Manager won't grow
with your business, so it's a complete non-starter. I think either its designers were given a duff brief, or more likely the marketing people have been sticking their oar in again, trying to ensure Office won't compete with Microsoft's own new Great Plains CRM (Customer Relationship Management) software.
Simon Jones tests Office 2003 and brings you his verdict of what Microsoft has in store (June 2003)
104 (June 2003)
Applications
Paul Lynch
eBooks in your Palm
One of the best reasons for owning a PDA, beyond the now-boring PIM functions that come with every device, is the ability to view portable multimedia content. For audio, PDAs aren't as good as dedicated MP3 players, but they do work well for reading eBooks. Dedicated eBook readers, however, are one gadget too far. I can juggle a PDA, mobile phone and MP3 player, but it's just as easy to pack a paperback as to bother with an eBook reader.
*Some people can't cope with reading text on a PDA,
er is configured from scratch, DNS installed, DHCP goes in and it's now Friday night so we'll sort the rest out on Monday...
*On Monday, your good intentions are pushed aside as you're plunged into a sea of other unrelated problems, and by lunchtime Friday's incident is all but forgotten. Everyone can log in, has access to the Internet (so DNS must be working) and they're happy. Until, that is, you come to apply some NTFS permissions to a folder and discover that, while you can see the Eff
ective Permissions for local users, Windows flatly refuses to display those details for any domain user, regardless of who they are or their status.
*Something is niggling your mind, but you can't quite put your finger on it. You check a whole slew of parameters, but you can still only view the Effective Permissions for local users.
*After a while it occurs to you to look at the Event Viewer, and there under Application you discover rows of the dreaded white-on-red Xs. You see that the sa
me three errors are repeating, the first of which (Event ID 1030) says:
*'Windows cannot query for the list of Group Policy objects. A message that describes the reason for this was previously logged by the policy engine.'
*Error number two (Event ID 1053) says:
*'Windows cannot determine the user or computer name (Access is denied), Group Policy processing aborted.'
*Error number three (Event ID 1058) says:
*'Windows cannot access the file gpt.ini for ... (The network path was not found.)
Group Policy processing aborted.'
*You could be forgiven for thinking this is a Group Policy issue, and, of course, it is, but its underlying cause is DNS or, more specifically, cached DNS on those client machines that survived the domain controller crashing last week. Everyone was largely unaffected then, because when Windows XP found it couldn't access the domain controller for logon, their cached logon credentials were used and are still being used.
*The confirming test is simple enou
gh: fire up a Command Prompt and enter 'ipconfig'. A glance shows that the IP address for the system is correct, as is the subnet mask, and the default gateway is properly configured. Now you have to bravely type 'ipconfig /flushdns', which clears the DNS cache on the client system - you prove that by typing 'ipconfig /displaydns'.
*You can try to re-register the DNS data by typing 'ipconfig /registerdns', then attempt to examine the Effective Permissions once more. Now the system can't s
ee any domains at all.
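*For reference, the client-side sequence just described amounts to four commands at the Command Prompt:

ipconfig               (check that the address, subnet mask and gateway look right)
ipconfig /flushdns     (throw away the stale cached DNS entries)
ipconfig /displaydns   (confirm the cache really is empty)
ipconfig /registerdns  (ask the client to re-register its DNS records)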
*Turn to the server and examine the event logs - they're packed with error messages (Event ID 5513), each saying the same thing about a different network client:
*'The computer COMPUTER NAME tried to connect to the server \\SERVERNAME using the trust relationship established by the DOMAINNAME domain. However, the computer lost the correct security identifier (SID) when the domain was reconfigured. Re-establish the trust relationship.'
*With a sinking feeling, you reali
se the problem's been there since that old domain controller went belly-up, and you know only too well what happens next. In My Computer | Properties | Computer Name, click the Change button and remove the system from the domain, then tell it that it's in a workgroup (whose temporary name is unimportant) and restart the system.
*Now repeat, taking the system out of the temporary workgroup and rejoining it to the domain. You've cured the Effective Permissions problem, your system and the d
omain can see each other again, and everyone's happy. Except you, who now has the unenviable task of explaining to all these users that they're about to get a new SID and local profile, which means they're going to lose their Desktop, their My Documents folder, possibly access to some applications (depending on how they were installed) and plenty more. You're not going to be popular around the office.
*But fear not. I'll show you how to restore those users' Desktops and, while the process
is a little tedious, you'll make a lot of people's lives easier. This isn't going to work for systems with thousands of users, but at that level you'd have had proper backups and probably more than one domain controller, wouldn't you? The key to the solution is that while you're now using a new profile, the old one still exists on the system - you have to get the system to use the old version.
*To see this, take a look in the Documents and Settings folder, where you'll find a folder with
a logon name, plus a new folder whose name is that same logon name with the domain name appended:
*Old profile folder name: fred
*New profile folder name: fred.fredscompany
*You could just copy all the shortcuts, documents and so on from the old profile, but that would lose Desktop settings and personal settings for applications. To use all the old settings with the new profile, you need to re-permission the ntuser.dat (HKEY_CURRENT_USER hive) in the fred profile so that the 'new' fred account can us
e it, while at the same time you want the new account to access the old profile folder to save copying all the documents across. To do this, follow these steps.
*Log on as an Administrator. Note that if the 'new' domain account belongs to the Local Administrators group, you don't need to perform the following steps. This is because local administrators have full control of ntuser.dat, but it's unlikely that a standard user will be an administrator.
*To re-permission ntuser.dat, launch reg
edt32.exe and then highlight HKEY_LOCAL_MACHINE. From the File menu select Load Hive and browse to the Documents and Settings\fred folder. Select ntuser.dat (if you can't see it, you need to configure the system to view hidden files). You'll be prompted for a name, and as before the temporary name you choose is unimportant. You'll now have a new hive listed under HKEY_LOCAL_MACHINE.
*Select the new hive and go to Edit | Permissions. Remove the SID entry that you'll see there (a string of
numbers following an icon of a user's head, with a red question mark on it indicating that this SID can't be resolved - if it could, you'd see the friendly name of the account holder rather than this numeric code). Now add the new fred account.
*You can now unload the hive via File | Unload Hive. Go back to Documents and Settings, locate the old profile and change its NTFS permissions. Remove the old SID entry and add the 'new' account, making sure it has Full Control and also ensuring th
at the account is given access to all subfolders and files.
*Next, get the system to load the old profile when the 'new' account logs on. Run regedt32.exe again and open up HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList
*You'll find a list of SIDs (those that end in 500 are administrative, so you can ignore them), and as you click on each one you'll see that it has a value of 'ProfileImagePath' holding the folder to which it points. You'll find that two have
these 'ProfileImagePath' values:
*%SystemDrive%\Documents and Settings\fred
*%SystemDrive%\Documents and Settings\fred.fredscompany
*Simply select the latter, remove the '.fredscompany' part and the 'new' account will now point to the old profile. Close the Registry Editor, restart and log in. The old Desktop appears and, if all's gone well, the applications work and all shortcuts and settings are still in place. Now repeat for each user (ouch!).
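*If you have a lot of these to do, the hive-loading and repointing steps can also be driven from the command line with reg.exe. This is only a sketch - 'fred' is the example account used above and the SID shown is illustrative, and you still set the hive and folder permissions exactly as just described:

reg load HKLM\TempFred "C:\Documents and Settings\fred\ntuser.dat"
rem ...re-permission the TempFred hive (and the old profile folder) as described above...
reg unload HKLM\TempFred
rem A literal path is used below so the shell doesn't expand %SystemDrive% before reg.exe sees it.
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\S-1-5-21-xxxxxxx-1004" /v ProfileImagePath /t REG_EXPAND_SZ /d "C:\Documents and Settings\fred" /f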
*I'd like to finish by thanking Microsof
t PSS, in the guise of Kam Patel from the Windows team in Premier Support, for their invaluable help in turning this tricky situation around. And the ultimate lesson? Install a second domain controller now - having a second one would have saved all these headaches.
**New data types
*An interesting issue has arisen with the arrival of OneNote, Microsoft's new note-taking application for Tablet PCs (and others). OneNote's voice recordings consume around 8 to 10MB of disk space per hour so, if it cat
ches on, voice will become a primary data type alongside word processing and spreadsheet data. It's therefore worth considering the issues voice recording raises, as they affect networks and servers.
*At first glance, 8MB/hour looks neither here nor there, given that gigabytes of disk appear to be given away free with a burger and fries these days. Nor will the network traffic loads imposed by moving the files around be much trouble. In a traditional 'passive' view of business computing,
these voice files will have no impact at all.
*However, I think this is to miss a golden opportunity. You see, for some time, I've been getting increasingly frustrated by the gap between the potential performance a server offers and the amount of real work it does. Ignoring specific application servers that are run near flat-out doing database or ERP work, let's consider that vast mass of line-of-business file/print servers that spend most of their time doing nothing.
*I recently set a cha
llenge to one corporate client: as part of their software evaluation process, they should evaluate exactly how much processing is being done by their expensive server-side applications. It's just not reasonable, I contend, to have a large application server sitting around doing nothing all the time; rather, it's a grotesque waste of opportunity by the software vendors, who should actually be judged by how much horsepower they can consume. (Obviously, this has to be real work - no-one will be
fooled by an application that runs the equivalent of the system idle thread). I've set them the target of having their processor loadings increase by 10 per cent per annum cumulatively - an extremely hard task and one that I know won't be met.
*The important point is that it's making the company rethink its historical perspective, in which small, fast and lightweight was considered good. In today's climate, I want number crunching, I want steam coming out of the fans, and I want to see tr
ue return on investment from my servers.
*Now, let's go full circle back to OneNote. If voice is going to become much more important in the business context - both from OneNote and from, say, Voice over IP - I demand that my servers take on the computationally overwhelming task of attempting to do voice-to-text recognition. Just converting a conversation into a text stream would be a good start, and once that has been cracked reasonably well I want big neural network calculation engines to
try to identify different speakers and thus discover the structure of the conversation.
*At the very least, I want to have a textual representation of my voice data available through an iFilter interface so that it can be indexed and, ideally, I'd want to ask the important business question, 'who said what about this, and when?'
*Voice recognition is an extremely difficult problem for a computer. We rely on a huge amount of contextual information: sight, to see who's talking, and stereo fr
om our binaural ear arrangement to differentiate sounds by direction.
*I'm not asking that all of this be built into my Compaq Tablet PC as a free upgrade. I accept that the built-in microphone isn't up to the task and that capturing the direction and position of people speaking will be a huge help. Future meeting room desks ought to have something like Calrec's SoundField microphone (<A HREF="http://www.soundfield.co.uk" target="_BLANK">www.soundfield.co.uk</A>) built into their centre,
so that the full, three-dimensional sound space can be recorded as four channels of digital audio. One of the world's greatest microphones for serious professional recording, the SoundField, is a gloriously over-the-top solution to this problem, but four channels wouldn't increase the data rate to a worrying level, nor would increasing the sampling and bit-rates to something approaching CD quality.
*The same logic applies to video too - why not video tape meetings and stream them onto a s
erver? We're entering an era of unlimited disk storage and spare computing cycles, so we have a whole new set of possibilities for making use of that capability.
**RMS
*Microsoft is introducing a new set of encryption technologies in Windows 2003 Server, due in mid-summer. Called RMS, for Rights Management Services, it allows you to encrypt documents and emails on your network and to have full control over who can read them and what they can do with them. For example, it's possible to wr
ite a document in Word and then save it to disk, and as part of the saving process you can define who has rights to read the document and what they can do with it. So you might state that specific people are able to read the document but not print it, for example, in a manner similar to the basic system used in Adobe's PDF files.
*But RMS goes much further, as you can use it to time-out content that needs to have a specific ageing date, so the document ceases to be viewable after a certai
n date. Obviously, the document's owner is still able to open the document at any time and change the permissions as appropriate. Better still, this policy for ageing files can be controlled by central services, so it's possible to change the default ageing profile for a company from, say, six to four months without having to touch every document, simply by changing the profile on the main servers.
*RMS is a complex system, because it has an impact on email users too. You can send an emai
l to some users and specify that they can't forward it to anyone else for example, thus eliminating the possibility of email being 'accidentally' routed outside the company.
*In its first implementation, which has yet to be fully documented by Microsoft (though I'm promised a whole swathe of white papers will be available by the time you read this), RMS shows some signs of being a first-release technology. Its interface into Outlook has some clumsy areas, for example, but it does appear
at a first glance to be secure and a grown-up solution to the problem. I'll be reporting back once we get final code both of the server-side components and of Office 11, together with the final version of a document-renderer plug-in that loads into Internet Explorer.
*There are many uses for this sort of technology. Preventing documents from leaking is a tricky, almost intractable problem to solve, and given the rise in popularity of email and Web access - together with laptops going in a
nd out of companies all the time - it's easier than ever for information to leak both intentionally and by mistake.
*I can't help wondering whether Microsoft has created this solution in response to its own notorious leakiness. It will be interesting to watch its own take-up and implementation of this technology over the next year. For organisations that don't have a real and present danger from leakage, I'd probably recommend that you take a waiting position until the dust has settled. I
mplementing a company-wide document-encryption scheme isn't something to be rushed into, especially as a response to an embarrassing leak.
*And there are many question marks hanging over this first release from Microsoft, such as: how easy is it to do document recovery? Who holds master keys, and are they government agencies like the CIA or FBI? Or the equivalents in our own countries? Just how easy to use will it be, and can we make document encryption the default rather than the exception by ensuring that it's extremely simple to use?
David Moss attempts to keep his users happy, while Jon Honeyball notes Microsoft's solution to leaks (June 2003)
Opportunities for talented developers entering the multimedia industry have never been better. However, whether you're creating an online portfolio for yourself, a CD business card, a business presentation or a complete e-learning package, the choice of authoring tools can be bewildering.
*This month, I'll look at the range of authoring tools, image and sound editors available, and arrange them into price bands to suit all pockets and purposes. Bear in mind that the prices I quote are, in
just as some can't read documentation from a VDU. It's a skill that can be learned, though, and we're all going to have to do it; I crossed this Rubicon more than ten years ago.
*Books, especially fiction, are different, though. We perceive them very differently from textbooks or manuals. I love buying good books in hardback, on high-quality paper with well-chosen typefaces (even letterpress). So getting used to eBooks isn't easy, as displays are still blocky and grainy with poor fonts a
nd formatting. It's possible to adapt, though, and I have. It's rewarding to download an old favourite, or perhaps get hold of a publisher's galley proof of a new title and read it while travelling. It won't match the feel and smell of a good book, but then neither do cheap paperback editions.
*EBooks are a touchy subject, because some high-profile ventures by traditional publishers have failed miserably, although several small publishers have had enough success to survive and thrive. The
y've found a market of dedicated eBook readers, people who've learned how to cope with reading within the restrictions of the PDA format.
*One group of such keen readers is Bible scholars, who need online, searchable copies of the Good Book, perhaps in various versions. These people have pioneered the format. Baen Books has also made a success of its Webscriptions venture (<A HREF="http://www.webscription.net" target="_BLANK">www.webscription.net</A>). It's a well-known name among
sci-fi readers, a dedicated group that routinely read several books in succession, often while travelling, and have above-average technical skills.
*Possibly the most successful eBook publisher so far is the company that used to be called Peanut Press, which had its own reader software with support for both markup and encryption. This was bought by Palm, and now Palm Digital Media produces reader software and eBook-creation software for Pocket PC, Symbian and Palm OS operating systems. Its Palm Reader is bu
ndled free with most Palm OS devices, and a commercial upgrade, Palm Reader Pro, is available cheaply from <A HREF="http://www.palmdigitalmedia.com" target="_BLANK">www.palmdigitalmedia.com</A>. Another good source of eBooks in Palm formats is <A HREF="http://www.memoware.com" target="_BLANK">www.memoware.com</A>, the outcome of several free eBook libraries combined.
*On a Palm OS device, everything is stored in what's effectively a database file, structured for retrieval in a slightly mor
e sophisticated way than the flat files we're used to on previous-generation computers. This additional layer of complexity means that when it comes to storing flat text files on a Palm, it isn't obvious how best to do it.
*Back in the early days of Palm, Rick Bram came up with the idea for the Doc format, which stores any sort of data as a chain of 4KB-sized records, including a title for the document. He also produced a free program called PalmDoc to read this format, and a string of re
ader/writers for PCs and Macs called MakeDoc. Doc was sold to Aportis, which produced a commercial reader, AportisDoc. This unfortunately closed down at the end of 2002, but not before Doc had established itself as the main document format for both Palm and Pocket PC. The fact that the format is portable across platforms, unlike Microsoft's equivalent, makes it attractive to publishers who want to keep a foot in every camp.
*The Doc format has quite a few limitations for eBook publishing.
It needs more effective compression, plus a standard for marking up and formatting texts. EBook publishers also want some form of digital rights control, which is absent.
*Several of the dozens of available readers solve some of these problems by supporting HTML as their markup language, despite the severe fragmentation of the HTML standard and its relative verbosity. So HTML-Doc is now a common format and is supported by some eBook publishers, notably Baen Webscriptions.
*In practice, t
his means a tripartite split among Doc format readers: one set is good for public domain texts with minimal markup; another handles unencrypted documents with HTML markup (say MobiReader from www.mobipocket.com); and the final set contains one member, Palm Reader, for documents with both markup and digital security. These sets hardly intersect, although both Palm Reader and MobiReader can read plain Doc files too. Palm Reader can also cope with HTML-Doc files in a limited way - you get to
see all the HTML codes, which makes heavy weather of a little light reading. This obviously isn't acceptable for comfortable reading, and Palm Reader is the only reader that can read its own proprietary format.
*MobiReader is an odd choice for me to single out, as it has a huge memory footprint (around 400KB compared with Palm Reader's 140KB) and an awkward user interface. There are better readers for HTML-Doc than MobiReader, but it exists for Pocket PC as well as Palm OS, and, more impo
rtantly, is the reader recommended by Baen Webscriptions. MobiReader has its own encryption format, but this only applies to books purchased from MobiPocket, not to freeware or Baen books. It's interesting to note that Baen provides its books in a form that can be downloaded many times by the account holder, in multiple formats with no encryption, whereas both MobiPocket and Palm Reader use encryption to restrict access to books in their formats.
*For Pocket PC and Windows machines only, M
icrosoft supplies its own proprietary format, known by its extension as LIT, which isn't available on Palm OS or any other platform. I've looked at creating LIT eBooks in a previous column, so this time it's the turn of the Palm platform.
*Cooking the eBooks
*Creating an eBook in HTML-Doc format is easy - just download the MobiPocket Publisher from <A HREF="http://www.mobipocket.com" target="_BLANK">www.mobipocket.com</A> and follow the tutorial. In essence, you create a metafile that hold
s all the information for creating the book by combining a number of HTML files.
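*As far as I can tell, that metafile is based on the Open eBook package format. Purely as an illustration (the filenames and titles here are invented, and this isn't copied from MobiPocket's documentation), such a package looks roughly like this:

<?xml version="1.0"?>
<package unique-identifier="uid">
  <metadata>
    <dc-metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:Title>A Sample Novel</dc:Title>
      <dc:Creator>A. N. Author</dc:Creator>
    </dc-metadata>
  </metadata>
  <manifest>
    <item id="chap1" href="chapter1.html" media-type="text/x-oeb1-document"/>
    <item id="chap2" href="chapter2.html" media-type="text/x-oeb1-document"/>
    <item id="cover" href="cover.jpg" media-type="image/jpeg"/>
  </manifest>
  <spine>
    <itemref idref="chap1"/>
    <itemref idref="chap2"/>
  </spine>
</package>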
*ASCII and Word/RTF documents can be converted into the required format by Wizards. An eBook also needs an author, a title, a cover page and a table of contents, and MobiPocket Publisher includes a Wizard to help you create each of these if your HTML is already nicely formatted. It gets a lot harder if you don't have well-formatted HTML. The cover page is a single image, and the table of contents can be autom
atically generated or built by hand. These are both nice to have, and give a professional look to the eBook you create.
*This is a similar process to the Microsoft Reader publication process and is clearly intended for publishers, large and small, for creating eBooks with an option (if you pay money for it) to include encryption and password protection.
*At the other extreme is the (free) personal publisher used by Palm Digital Media called DropBook. You have to create a marked-up text fil
e, but then the process of publication is simply to drop it on the application. It will ask for name details, then a place to save it to, and that's all. It's a bit more laborious to add the markup, although there are macros available to help with Word documents. This is a trade-off when compared with the MobiPocket approach.
*With MobiPocket, the process is polished, but it can take a lot of effort adjusting the HTML and table of contents to get it all to match and make the jigsaw pieces
fit together (something I first discovered when making LIT ebooks for Pocket PC). With DropBook, you have to do all the markup and document structuring yourself, but the publication process becomes trivial. For me, the effort exactly balances out: sometimes I don't want or need a cover page and table of contents, and don't need the extra hassle required to get all the HTML to line up right, and sometimes the Wizards give an almost perfect result first time. Palm Digital Media also has a f
ull-featured publication package, called Palm eBook Studio, aimed of course at publishers with a desire for security.
**International Access
*A reader, David Raggett, wrote in to say that he uses the AT&T international ISP service, <A HREF="http://www.attglobal.net" target="_BLANK">www.attglobal.net</A>, with general satisfaction and good bandwidth in most locations, although he finds coverage to be scarce in the Middle East.
*I've also had a number of comments about the GRIC service that I mentioned recently. It seems that it changes its connection points in some countries rather frequently, so it pays to update the list of numbers often, as in some areas they've been known to change from week to week.
Paul Lynch helps your transition from books to electronic paper (June 2003)
104 (June 2003)i
Mobile Computing
Steve Cassidy
Sighs of relief
Feedback is a funny thing. If you're a technologist, feedback happens in amplifiers and other types of circuit - it's rapid and has to be held in check. For network consultants and journalists, however, feedback takes ages to arrive and resolve into a coherent picture.
*I frequently get feedback from intensely techie types who want to read a column full of algorithms for relating maximum PCI bus throughput to the speed of your LAN, or who feel that networking is all about fine-tuning the m
most cases, for downloadable versions expressed in US dollars and are therefore subject to change over time. Where appropriate, VAT is included. If you have a specific area of multimedia, or a particular project, in mind, you can narrow down your choices and find the least costly combination.
*Let's start with the most basic form of multimedia, the CD business card, portfolio or media player. For this job, it's possible to buy a software suite capable of professional results for less than £100. The best authoring tool in the sub-£50 band is Multimedia Builder (<A HREF="http://www.mediachance.com" target="_BLANK">www.mediachance.com</A>), which you can download and try out for free. Registration costs around £30, for which you get a tool with advanced features including custom-shaped windows and even a rudimentary scripting language. Multimedia Builder has been developed pretty much single-handedly by Roman Voska, and its range of features shows where his interests lie.
Where tools like Opus, Mediator and Director offer a wide range of capabilities, Multimedia Builder concentrates on specific areas of interest to its creator. For example, it's very good for projects such as autorun menus for CD-ROMs, audio CD/MP3 menus, portfolios and business cards, but useless if you're thinking of deploying on the Internet or for creating multimedia presentations. While it includes a scripting language that makes it the perfect launcher for other programs, it doesn't h
ave any sort of timeline, which makes it difficult to co-ordinate graphics and text with a voiceover or music track. On the other hand, it has an excellent special effects generator for creating visualisations similar to those in Media Player, and it's also able to host other applications; for example, you could create a Word or Internet Explorer window in your presentation.
*Multimedia Builder is ideal for creating simple multimedia projects of a few pages, particularly ones that don't i
nclude synchronised sound and visuals, or ones that launch or host external applications.
*This shoestring package needs a sound editor, and GoldWave (<A HREF="http://www.goldwave.com" target="_BLANK">www.goldwave.com</A>) offers the best value for money in this price range. Available online for $40 (£25), GoldWave includes the standard waveform editing view and the crucial ability to read and write MP3 files, as well as WAV and a range of less useful formats. It even includes an equalizer and noise-reduction filters.
*Finally, you'll need a low-cost, highly featured, image editor. Coincidentally, MediaChance (vendor of Multimedia Builder) also distributes Real-DRAW, which - with its £35 registration fee - competes head on with Macromedia's Fireworks, offering vector and bitmap editing in one package. Believe it or not, this package was also created by Roman Voska.
*Real-DRAW makes it simple to create all the elements of an application interface. In fact, it could teach
Macromedia a few lessons about how to efficiently apply effects such as variable transparency, bevelling and drop shadows. I particularly like its option to output an alpha channel to a separate file. Also, despite boasting a huge range of features, Real-DRAW won't slow your PC to a crawl and zips along on even the most modest of CPUs. As with Multimedia Builder, Real-DRAW's feature range tends to be patchy, but at its low price it deserves a place in any developer's toolkit.
£200+
*For less than £100, you can kit yourself out with a usable, if basic, authoring package, sound editor and excellent image editor. But what if your ambitions extend beyond e-cards and program launchers? If you're at all serious about general multimedia development, you're going to need to invest in a truly professional authoring tool.
*Opus Presenter (<A HREF="http://www.digitalworkshop.co.uk" target="_BLANK">www.digitalworkshop.co.uk</A>) is the lowest-priced general-purpose tool, coming in at a shade under £100. Regular readers of this column will know how highly I rate Opus, and Presenter is a replacement for the entry-level Opus Standard. It lacks a scripting language, but is suitable for most purposes, including creating business presentations and undemanding e-learning applications. You can use it to create materials to be delivered online through its proprietary plug-in, or for an extra £50 the Opus Flex enhancement allows you to output in Flash SWF format. (However, if you want to create Flash movies, buy Flash - there's no substitute.)
*My favourite low-cost audio editor has always been Sound Forge XP, which has now been renamed Sound Forge Studio (<A HREF="http://www.sonicfoundry.com" target="_BLANK">www.sonicfoundry.com</A>). At around £44, this is a cut-down version of the heavyweight Sound Forge, but it lacks nothing that a mere multimedia developer would miss. I've been using Sound Forge XP/Studio ever since a copy came bundled with my early version of Director and it's now part of the standard kit for all my developers at NlightN.
*Sound Forge Studio is rock solid, easy to use and reads/writes sound formats including MP3 and WAV. It has a huge range of effects and processes, including the very useful Noise Gate, which eliminates samples below a certain volume level, removing hiss between words in a voiceover, for example.
*Opus distributor Digital Workshop also supplies Paint Shop Pro, soon to move to version 8. Originally a
shareware image converter, Paint Shop Pro is now a genuine budget alternative to Photoshop. Version 8 adds a background eraser, which sounds like an enhanced form of the colour replacer tool, plus image-warping tools and an array of new filters and effects including the genuinely useful tiling tool.
*Paint Shop Pro 8 is likely to retail at around £80 (including VAT), which, given its functionality, represents excellent value. The combination of Opus Presenter/Standard, Sound Forge Studio and Paint Shop Pro is probably the minimum acceptable software suite for professional multimedia developers, and for around £230 gives you a complete production environment capable of tackling most multimedia tasks.
*Opus Presenter is ideal for creating top-quality presentations, simple games and relatively undemanding e-learning apps, but its lack of a scripting language or extendibility means you can't bend it to your will as you can, for example, Macromedia's Director. The most pocket-
friendly authoring tool with script support is, in fact, Presenter's big brother Opus Pro, which weighs in at £249.
*However, Opus Pro's scripting language, OpusScript, doesn't yet have the ease of use or range of functions needed to make it a true all-rounder. It follows the ECMA guidelines a little too closely, to the extent that it chucks out extremely unhelpful error messages if you forget to capitalise commands properly. Given that its user base is likely to include newcomers to prog
ramming (upgraders from Opus Presenter, for example), OpusScript needs to become easier to understand, more tolerant and more accurately documented before I can recommend it.
£1,000+
*The next big step up is to a general-purpose authoring tool capable of serving as the mainstay of a professional multimedia company on a budget. I like MatchWare's Mediator EXP 7, although I think its £649 (exc VAT) price tag is a little on the high side.
*Mediator (<A HREF="http://www.matchware.net" target="_BLANK">www.matchware.net</A>) has a number of advantages over Opus Pro, including the fact that its script languages (you can choose either VBScript or JavaScript) are already widely used, so you're likely either to know one already or to be able to learn it quickly. Mediator is also, crucially, able to host ActiveX controls, which opens up a huge range of possibilities, although this technology is beginning to show its age. I've just added a complex interaction with an online database to a Mediator EXP program using the Microsoft Internet Transfer Control very simply, although Mediator's inability to read the parameters of an ActiveX Event is a pain.
*My favourite all-round sound-recording suite is Cool Edit Pro (<A HREF="http://www.syntrillium.com" target="_BLANK">www.syntrillium.com</A>). Available online at just under £160, Cool Edit Pro provides everything you could possibly need in a multitrack recorder and audio editor except for the hardware. My RWC colleague Brian Heywood is better qualified than I am to talk about that side of things, but as an example, my own hardware setup is an M-Audio Delta 66 sound card bundled with an Omni I/O audio station. My microphone is an Audio Technica AT4033 condenser in a custom-built sound booth, and I get excellent results from this setup, certainly good enough for the purposes of commercial voiceover work.
*For manipulating images - resizing, changing colour depth, cropping and so on - I use Paint Shop Pro as above, bu
t most of my image editing is done in the excellent Macromedia Fireworks. I've been using Fireworks since version 1 and it's now my favourite tool for constructing interface elements, screen designs and websites, and it's available online for around £190. This brings the cost of my pro-level authoring package to around £1,000, not including hardware - a sizeable investment, but worthwhile because this combination of flexible, highly productive and easy-to-use authoring tools with professional sound and image editors allows you to meet the requirements for just about any sort of multimedia production.
*At NlightN I've created everything from database-driven, touchscreen kiosks to complex, multi-CD e-learning products using Mediator EXP. In the case of the kiosk application, we'd originally created it in Director, but the client wasn't happy with the quality of the transitions or the way Director can run sluggishly on even the fastest PC. I managed to recreate almost a month's
Director work in Mediator in just eight days (mind you, they were eight long days).
**The Top end
*Having said that, Director still has a place in our top-end multimedia package. If your line of work involves cross-platform delivery, or if you want to be able to cope with even the most non-standard of projects, Director is your man, as it can realistically be considered a jack-of-all-trades. The key is its ability to host Xtras that extend its functionality, along with its powerful (thou
gh idiosyncratic) Lingo scripting language.
*The natural companion to Director is Fireworks, but my experience is that most Director users prefer Adobe Photoshop. Topping the scales at a stiff £570, Photoshop has an array of features of benefit to multimedia developers but suffers from remaining print-oriented. It's also difficult to get to grips with. However, if your multimedia projects tend to be design-heavy or if you recognise the sense in learning to use the industry-standard image-editing package, it might be worth investing in Photoshop.
*There isn't a better multitrack sound editor than Cool Edit Pro, so my top-end suite of industry-standard software is complete, weighing in at just under £1,500. And remember that the expense and effort of learning Director and Photoshop does at least give you a highly saleable skill should you be looking to work for someone else.
*Of course, if your interests are purely focused on web development, your choice is simple: get Macromedia Studio MX - Flash, Fireworks, Dreamweaver and vector-editing package FreeHand all in one package for around £570. If you're working on a shoestring, you could buy Flash on its own for around £315 online and make do with low-cost or free image and sound editors.
*If you're a full-time student or teacher, you can benefit from big educational discounts on all Macromedia products, although you should be aware that the educational version of Director displays a 'not for commercial use'
splash screen on projectors, which is fair enough.
*Using ActiveX in Mediator
*I've spent a good deal of time over the last month or so creating a multimedia 'shell' for my designers to work on, which takes the form of a Mediator program containing a series of standard objects, template pages and blank pages along with scripts that handle user login, tracking and questioning. One of the reasons Mediator was chosen as the development tool for this major project was its ability to host Activ
eX controls, plus the scripting power needed to get the most out of them. Director also has this ability, something I put to good use when I hosted an Internet Explorer window in my latest Director project and used Lingo to control what was displayed when it refreshed. I was even able to use IE's print engine from within Director to print HTML pages.
*It obviously makes sense to use ActiveX controls that are likely to be already present on your target PCs, including, for example, all the M
icrosoft Forms controls and standard interface components. These let you add list boxes, combo boxes, sliders and many other widgets to your Mediator or Director presentations - in my case, a slider control.
*Here's how to do it in Mediator EXP. Begin by clicking the ActiveX button on the toolbox and Mediator will present a box containing a list of the ActiveXs you've used previously - to add a new ActiveX, click Customize. Find and select Microsoft Slider Control, click OK,
then select it from the list and drag and drop it onto your Mediator page. By default, the slider control has values from 0 to 10, but these can be changed in its Properties box. Begin by changing the name of the ActiveX to 'Slider'.
*Obviously, the point of a slider control is to find out what value the user has selected, and this is simple. Add a standard Mediator Input Text object to the page that you'll use to keep track of the slider's value. Create a new object variable called 'mInp
ut'. I use a convention under which all Mediator-derived variables are prefixed with an 'm' to differentiate them from variables created in script. Now, add a new Script to the page and create a Sub like this:
*Sub SliderEvent()
*Set mInput = Md8Objects.Item("Input")
*mInput.Text = CStr(Slider.ActiveX.Value)
*End Sub
*The first line after the Sub statement links the object variable mInput to the Mediator Text Input object. The second line changes the text that appears in the input box to the strin
g equivalent of the slider's current value. To tie it all together, right-click the ActiveX control, select Events and choose the ActiveX event. Then add a Script action that runs SliderEvent. Job done! Now, when the application is running, this script action is called every time the user clicks on the slider and the input box is updated with the current value. Simple but powerful.
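*You can push values the other way too, setting up the control from script rather than just reading it. This is only a sketch - it assumes the ActiveX keeps the name 'Slider' and that the standard Min, Max, TickFrequency and Value properties of the Microsoft Slider Control are reachable through the same ActiveX property used above - but a script action run as the page opens could widen the default 0-to-10 range like this:
*Sub InitSlider()
*' Assumed names: 'Slider' is the ActiveX added above; the properties are those of the Microsoft Slider Control
*Slider.ActiveX.Min = 0
*Slider.ActiveX.Max = 100
*Slider.ActiveX.TickFrequency = 10
*Slider.ActiveX.Value = 50
*End Sub
*Hook InitSlider to whichever event fires as the page is shown and the slider arrives on screen already set to sensible values.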
*Mediator EXP 7 comes with the HTTP Request resource, which claims to allow contact with online databases. Personally, I've found problems with it, in that you can't specify what header to send up, meaning that you can't connect to some databases. I resorted to the Microsoft Internet Transfer Control, which is able to connect using a single command along these lines (the control name and URL here are illustrative only):
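*Sub FetchRecord()
*' 'Inet' is an assumed name for the Internet Transfer Control ActiveX on the page, and the URL is a placeholder
*' "GET" is the HTTP operation passed to the control's Execute method
*Inet.ActiveX.Execute "http://www.example.com/lookup.asp?user=42", "GET"
*End Sub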
*The Internet Transfer Control has a property called State that changes from 0 to 1 when it has completed the transfer. So, by setting up a timeline to keep
checking that property, you can easily see if it has completed. Of course, you may or may not have actually retrieved any data, so you need to create error-trapping code in case the database hasn't responded or the data you've sent isn't valid. You'll also need to ensure Mediator is given permission to access the Internet by any firewall software installed on the end user's PC. Unfortunately, Mediator doesn't trap the parameters sent back by ActiveX events so it isn't possible to keep trac
k of where in the process of connecting with the database any problem occurred.
*You'll need to know what properties, methods and events each ActiveX supports and, although some properties are automatically displayed in Mediator, many others remain hidden. With Microsoft ActiveX controls, the best place to look is on Microsoft's website. All the information you're likely to need is in the VB 6 documentation at <A HREF="http://msdn.microsoft.com/default.asp" target="_BLANK">http://msdn.microsoft.com/default.asp</A>.
Kevin Partner guides you through a range of authoring tools and editors
June 2003
ark/space ratio in packing algorithms for Gigabit Ethernet low-layer protocols. This kind of feedback typically takes up 20 pages of A4 and ends with a simple triumphal statement that starts '...so it clearly follows that anyone not up-to-date on IPv6 is simply not qualified to comment!'
*Then there's the other kind of feedback, where someone expresses their relief because the mists of obscurity have been lifted from some topic they've been puzzling over for years. I hope the hard-core tec
hies will forgive me if I say that this latter kind is infinitely more satisfying.
*I often get a 'quiet sigh' of feedback during training courses. On those rare occasions when I produce courses, I set aside a good half-hour to explain drive letters: what they are; where they come from; who decides what they should be; and whatever happened to the B drive? You'd be surprised at the seniority of the people who express deep and abiding relief when something so utterly simple (to us techies
) is explained in the kind of detail that answers their fearful questions and shows them they're not mad for wondering what happened to the B drive.
*I know it's frustrating to have to baby-talk people this way. You may think that you needn't detain yourself over something as tedious as a meaningless phobia over drive letters, but you must. I got a similar sigh of feedback when I went through the iterations of half-duplex, full-duplex, 10Mb, 100Mb, CAT-5, shielded, unshielded, twisted or
untwisted network connections. People who consider themselves heavy-hitting software gurus emailed me saying: 'I was always sure there was something to that stuff, but I'd never bothered to think it through' and 'Crikey, our network runs smoother now.'
*I'm not just pointlessly congratulating myself here. I have in mind a particular pair of harrowing incidents that broke the smooth progress of my month into a stuttering mess of disconnected site visits, phone calls, project slippages and d
isastrous cock-ups, simply because people have been seduced by the marvellous over-complexity of networks and communications. They've forgotten how to fill in the gaps to make the whole show work, because, first, they haven't listened to feedback and, second, they haven't examined the simplest possible statements one could make about the problem.
*I'm talking about ADSL. To judge from the conversations I hear, those businesses that can are jumping on the ADSL bandwagon at a rate of knots,
recession or no recession. In February, I had a pair of DSL connections go live and neither of them worked correctly. Both failed in the more advanced forms of use and abuse, and both were appallingly served by those 'systems' supposedly put in place for customers to ring up and complain.
*Scenario 1: a BT Ethernet-presented broadband router is installed. This is to be connected to a tiny three-PC network by way of an intermediary hardware router, with a basic firewall (because it will be
a cold day in hell before BT adapts its procedures and standards to accommodate small networks). The router passes the usual tests for operability and says it has logged into either the BT test site or the third-party ISP's RADIUS servers, but it won't let client PCs browse the Web. Sites can be pinged if you know their numeric IP address, but DNS queries don't get a reply.
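*The distinction is easy to demonstrate from any client PC. The few lines of Windows Script Host below are only a rough sketch - 192.0.2.1 stands in for a numeric address you already know to be reachable, and www.pcpro.co.uk simply serves as a name that has to be resolved - but if the first ping succeeds and the second fails, the finger points at DNS rather than at the line itself:
*Set sh = CreateObject("WScript.Shell")
*' A zero exit code from ping means replies came back
*ipOK = (sh.Run("%comspec% /c ping -n 1 192.0.2.1", 0, True) = 0)
*nameOK = (sh.Run("%comspec% /c ping -n 1 www.pcpro.co.uk", 0, True) = 0)
*If ipOK And Not nameOK Then
*WScript.Echo "Raw connectivity is fine but name resolution is failing - suspect DNS."
*End If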
*Enter that woeful disaster, call centre technical 'support'. The majority of people asking for technical support
on DSL connections have something badly wrong with their local setup. They trash their computer or fail to stop it looking at DUN when they convert to routed DSL: that is, something goes wrong in the way everything else is put together, except the DSL line. So most call centres will run through a neat little script intended to establish that it isn't their problem.
*Except that, in this case, they couldn't establish anything of the sort. All the machines were running clean, one even being
brand new. This at least shortened the diagnostic turnaround time, as once they'd confirmed that I could ping anything but do nothing else we agreed to wait for another BT engineer to turn up with a replacement box. I packed my bags and let everyone concerned know that they weren't to let the engineer go until they could see the PC Pro website (my standard test).
*Then the phone rang - it was technical support. 'Try it again,' quoth they, 'We think BT misconfigured the router as NAT when w
e wanted non-NAT.' And what do you know, ten seconds turned off, a minute to reload and all was well, except I had 20 minutes to get the whole setup reconfigured.
*NAT versus non-NAT is a simple binary choice that hides many further choices with deep consequences, but it's presented as just two tickboxes to fill in at the start. Yet between a top-flight ISP and BT, a nationwide de facto monopoly, this simple choice had fallen between the cracks and their highly defensive support edifice r
equired more than one crack at the problem before spitting out the simple answer.
*If this were the only problem this month with DSL service provision and networking, I'd write it off as a fluke, but the second scenario was a complete hair-raiser. Nearly a decade ago, everyone was making a fuss about the great plan to deregulate the telecoms industry. The idea was to get away from the dead hand of Busby, BT's ill-starred mascot of the early 1980s, when everything to do with telecoms still
had old Post Office stickers on it, was bolted down with huge waterproof fastenings and secured against largely imaginary lightning strikes.
*Deregulation would lead, so we were told, to competition, and the last part of the deregulatory process would be unbundling the local loop. BT would finally relinquish control over that last bit of copper that runs to your front door, since nothing interesting happens on it and there are plenty of people (most of them ex-BT) who can look after it.
It has taken a long time for this unbundling to turn into anything deliverable for us networking types, but I've finally encountered a completely unbundled setup. However, it was such an unmitigated disaster that had it not been for that other nightmare encounter with BT this week, I'd now be flying Busby's outdated flag and loudly calling for re-nationalisation.
*The problem is simple. Even now, BT keeps careful control over the equipment it hands out, and that care has a good side and a bad side.
*It's good to know that any BT-provided business connection will feature basically the same re-badged Flowpoint hardware router; it's good to know that these always present themselves on 192.168.254.254; and it's good to know that you can't officially fiddle with their innards, which keeps lines of communication reasonably clear, so when they break down, the list of people to shout at is short and their obligations formalised by contract.
*As for the downsides: these Flowpoint route
rs boil over at the slightest provocation, so the majority of engineer visits are to replace burned-out ones (it's always the PSU); it's difficult to trust what BT has done inside that opaque configuration; and the flim-flam it employs to deflect reasonable enquiries is terrible. But believe me, I've seen the other side of the deregulated coin in the last few weeks and it's certainly no better.
*In this case, the client had an integrated telecoms provider - a small company that does office
cabling, basic networking and phone systems and was happy to deliver a DSL 'package' as well. I had reservations, so as an early test of competence I asked for a static IP address range along with the DSL contract and hardware it was pushing. The company accepted the order with this requirement and soon came back with the requested address range. Fair enough, I thought, test passed.
*In went its cabling team, and shortly after I set up the network. The only problem was that th
e DSL line wasn't quite ready, but we worked around the gap with a few dial-up lines and waited in good faith.
*Once the engineer had gone, we investigated using one of the promised static external IP addresses for the SonicWALL firewall appliance the client had chosen. What we found was surprising: a single PC detached from our new LAN, set up beside a tiny unbadged router, happily surfing the Web.
*Our new network was on the 192.168 private range, while the router
and detached PC had been set to 10.0.0.1 and 10.0.0.2. There was no sign of the public address range we'd been given. Earlier that week, some blokes had come along with a network card to set up a big photocopier in the office. They'd asked for a two-page questionnaire on the network to be faxed back to them, detailing the IP ranges, gateways, servers, client setup and any other comments, which was in stark contrast to the approach of our newly deregulated DSL service provider.
*So, we cal
led the engineer. We asked why he hadn't bothered to make the router use the external addresses and his reply was, 'Oh, that's easy. Just log in the web page on it and...'. I felt a shiver of premonitory fear. He hadn't done the job because he didn't know how, and he was now suckering me into leaving the router in an irrecoverable state by following his instructions. His mobile faded into a fortuitous reception hole and I despaired of any solution coming from that quarter. Except that the router's configuration page showed a username and password by which it authenticates with the actual provider of the high-tech parts of this service, and that provider had a website, so perhaps it could help.
*The provider was Bulldog at <A HREF="http://www.bulldogdsl.com" target="_BLANK">www.bulldogdsl.com</A>, and I'm happy to report that its staff were helpfulness personified. I didn't even have to wave the PC Pro stick at them, as every question was succinctly
and sensibly answered.
*They observed that the DSL router supplied by the contractor wasn't on their supported equipment list, but that judging from a quick Google search it might possibly work in the configuration we wanted. They were happy to talk me through it, but agreed that this was speculative at best and a far better solution lay in getting the supplier to deliver what had been agreed.
*I'd been looking over the router while on the phone, and in place of BT's sober 'link' light i
t sported an LED marked 'Showtime'. My faith in this cable-contractor-turned-DSL-vendor sank below zero.
*To cut a long story short, we persuaded the contractors that we wanted a configuration they didn't understand and that they should listen to what Bulldog had to say about compatible equipment instead of buying their routers from a Christmas-cracker vendor. Their response has been to withdraw all equipment from the client's site and invite us to achieve the setup without them.
*Deregul
ation may bring many benefits, but I'm sure those presiding over the demise of Busby didn't have in mind this kind of inexcusable sloppiness.
*I hate to think how stuffed I'd have been without Bulldog's pragmatic response and, while I'm happy to identify Bulldog - and its exemplary attitude - in this sorry tale, I won't be identifying the cable and telecoms contractor explicitly. Let's just say that if you're offered a DSL contract by a company whose website is hosted on a free consumer ISP such as Freeserve or Virgin.net, consider that it may not yet have got the hang of servicing the business community's networking needs.
Steve Cassidy has a bad month with ADSL and briefly considers re-nationalisation
104 (June 2003)
Networks
By now, regular readers should realise that my preferred online search engine is, like the majority of Internet users, Google (<A HREF="http://www.google.co.uk" target="_BLANK">www.google.co.uk</A>). Those paying close attention will also know I mostly use Google UK, as I'm looking for UK-specific information and it saves me having to add +UK to every query. And if I want to go global, it's just a click away.
*Quite apart from its undoubted supremacy in spidering and indexing web content,
and its all-important PageRank and Hypertext Matching Analysis weighting methodology (check out <A HREF="http://www.google.co.uk/press/overview_tech.html" target="_BLANK">www.google.co.uk/press/overview_tech.html</A> for the horse's mouth explanation), Google stays ahead of the game by predicting trends and implementing new search interfaces to service emerging needs.
*Most of the time, it gets it right too, like when it purchased the Deja Usenet Newsgroups archive database and turned it
into Google Groups (<A HREF="http://groups.google.com" target="_BLANK">http://groups.google.com</A>), which, despite the fears of many users, has become even more essential than before. The golden Google touch continued with the introduction of the image search facility (<A HREF="http://images.google.com" target="_BLANK">images.google.com</A>), which now lets you search among a mind-boggling 425 million images on the Web, by keyword naturally. So what, I have to ask, has gone so wrong at G
oogle HQ that it's now wasting precious research and development budget and the time of talented programmers in producing the Google Viewer (which I'll admit is still in demo/beta format)?
*This currently sits at the top of the list of new ideas at Google Labs (<A HREF="http://labs.google.com" target="_BLANK">labs.google.com</a>), which usually means it's the most open to feedback and the most likely to be changed by user criticism. Unlike savvy PC Pro readers, the world at large often do
esn't bother to delve beneath the small square box that is the Google search UI, which is a shame as they miss the chance to exert their influence.
*Described by Google as its 'technology playground', not every brainwave incubated here gets released into the real world, and user feedback via the associated forums can help sway things one way or the other. This is why I implore you to go and play with the Google Viewer for ten minutes, discover just how inefficient a means of search and re
trieval it is and then make your opinion known. I've already put in my ten-cents-worth, of course.
*So what is Google Viewer, and why do I think it's so wrong? It's meant to be an easy-to-use interface into Google searching that previews the found web pages themselves rather than just returning a link. In a broadband age, the idea itself isn't fatally flawed, but unfortunately the user interface most certainly is.
*A good user interface allows easy, intuitive access to an application or
service, but this one does neither. Far too much attention has been devoted to presenting a flashy simulation of the controls of a DVD/Video player, an automatic slide show function and so on, which isn't what I want from a search engine. I want quick access to relevant information - Google has always excelled at this, in my opinion. Sure, you'll get hits that don't meet your criteria, but at least you could quickly glance through the straightforward list of results and mentally filter th
em out. Once you start mucking about with this simplest of presentation schemes, you start messing with my brain, which is a bad idea.
*If you want another horrid example of a search interface gone mad, which totally nullifies the good database parsing behind it, take a look at the Flash-driven confusion that is KartOO (<A HREF="http://www.kartoo.com" target="_BLANK">www.kartoo.com</A>). I just can't be bothered to get to grips with it, and why should I when others present me with the inf
ormation I want without animations, contrived presentation of links, or making the information even harder to find once the results are in?
*Other search engines already implement site-preview features that are much easier to use, which is odd, since the whole point of the Google Viewer experiment is to take searching to an easier-to-understand level. The problem I have is that it takes away far more information than it adds, mainly because of the way our brains are set up to work. I doubt
I'm alone in being able to quickly scan a page of, say, ten search results in a linear fashion, discounting those that are in foreign languages or barking mad and just as quickly spotting those of interest. I do this every day of my life, usually opening the likely contenders in a new tab in NetCaptor or Opera.
*My attempts to work as quickly in the Google Viewer are hampered by the simple fact that no matter how fast your connection and your 'vision/brain/ action' reflexes are, you'r
e still forced to view every page preview in turn, including all the ones you'd instantly discard as not relevant. How is this meant to save time or make life easier? The answer is it doesn't, won't and can't. The website preview idea itself isn't flawed, just the odd way Google seems intent on implementing it. To see a different implementation that is useful, look no further than two search services you may not yet have heard of but which are fast becoming as essential as Google and Copernic A
gent: WiseNut (<A HREF="http://www.wisenut.com" target="_BLANK">www.wisenut.com</A>) and KillerInfo (<A HREF="http://www.killerinfo.com" target="_BLANK">www.killerinfo.com</A>).
**A WiseNut to crack
*WiseNut has been around for a while, going nowhere slowly until LookSmart bought it for a little under $10 million (£6.2m) last year. Now it's one of the four search tools I look to every day. It incorporates many features found elsewhere into a neat package that delivers the informational goods
in a straightforward manner.
*You start from a Google-inspired blank box and not much else, with results returned quickly, complete with an AltaVista-style semantically related category list at the top in a sideways directory-tree fashion. Results are returned in normal list format, although the arrangement of entries on the page leaves a little to be desired and can be hard on the eyes. However, the ace up the sleeve is revealed when you click the 'sneak-a-peek' link next to any entry,
which quickly opens the referenced web page right there within the list. Yes, the web page window gets elongated and you get scroll bars to move around, but it's live and right there - close the link and it's gone.
*This may not sound that wonderful, but once you've tried it you quickly get hooked into its logic. The only downside is that it does sometimes experience server overload in a way that, to my recollection, Google never has. Perhaps LookSmart is having problems scaling up to mat
ch demand as the Internet grapevine does its stuff.
*KillerInfo has pretty much the same things going for it as WiseNut, although its layout is more logical and therefore easier to use. Instead of sticking the semantic link tree at the top, KillerInfo breaks with search engine tradition by displaying it down the right edge in a vertical directory tree.
*Result listings fall short of Google's standard in terms of relevant information, but they're formatted in an easy-on-the-eye manner. Th
ey also feature a Quick Peek function, which is presented better than its rivals' in that you get a larger and more logical look at the web page in question. Again, this is live and running, so if you run your browser full screen at a decent resolution it's actually possible to browse a simple site without leaving the search engine.
*This brings me full circle, back to my original question - why can't Google provide this ability, which people would actually be able to use, instead of plod
ding on with its fatally flawed approach just for the sake of a flashy interface?
*Not all the current Google Labs demos are so removed from the real world, and one example is the excellent WebQuotes feature, which has been running since December last year. This is one of those simple ideas that has worked well in other sectors, such as Amazon and eBay, based on the notion of user feedback applied to search results. When you perform a search through the Google WebQuotes interface, the res
ults are shown along with related quotes snipped from pages that are linked to the sites in question. The end result is a useful gauge of what each site is about, from a configurable number of sources (I've found that 20 gives you a good enough idea), each with a direct link if you want to investigate the source of the comment more.
*The advantages of such a system need no explanation, but perhaps its dangers do. With the eBay rating systems, only people who've had dealings with - either
buying from or selling to - the member in question can leave a comment and have an impact upon their rating. On Amazon, comments are posted directly to TPTB, who therefore have the ability - many would say the responsibility - to remove potentially libellous or misleading postings. You're probably ahead of me already - in the proposed Google WebQuotes system, all the comments are snippets of opinion posted to completely independent sites outside of Google's control and effectively republis
hed within the results list your search generates.
*This isn't a big deal, unless yours is the site being libelled, but that itself is something of a narrow view - if the only linked comments to some site happen to be negative and false, how could you know? Most intelligent folk can spot an obviously spiteful comment, but if there are no balancing positive comments shown it can be quite hard to reveal the truth. This has only happened to me in a handful of cases, so in day-to-day use you
should be able to trust WebQuotes by using your common sense. If a site is as bad as its critics would have it, it won't take long to realise they were right.
*Another of the services being highlighted in Google Labs is the Google Glossary. Again, this isn't unique in providing a one-stop reference source for word definitions, acronym explanations and technical terminology clarification, but it does a good job. The search results take the form of definitions from various websites, not just
the usual educational resource suspects, which makes a pleasant change. There are also links to related phrases should you wish to explore further, and direct links to both dictionary.com and Merriam-Webster if you want the 'usual suspects' definitions.
*However, there are other sites that do a much better job, and I've already mentioned one of them - KillerInfo. This has recently added a reference channel, which starts with a dictionary definition of the keyword searched, before present
ing a list of definitions from various sites together with that useful Quick Peek feature. Most useful is that you also get the same directory-tree indexing as a sidebar, which makes finding exactly the right reference very quick and easy.
*KillerInfo also has some other 'channels', as it calls its specialised search filters, that maintain the same results structure but for the fields of business or news, for example. These work well, but while I'm in a critical mood it's a shame you can'
t do a reference search and, when selecting one of the alternate channels, be moved to the results on that same search. Instead, you have to change channel and start again from scratch. It doesn't take that long, but it's long enough to prevent me from doing so on a whim (and sometimes whim can be a useful research tool).
**Open-Source Searching
*After all this talk of interfaces and usability, I can't help but have a laugh at my own expense, seeing as I've been making rather a lot of a l
ittle toolbar utility I recently stumbled across while doing a regular hunt around the depths of SourceForge.net (<A HREF="http://www.sourceforge.net" target="_BLANK">www.sourceforge.net</A>). If you've never paid a visit to this Aladdin's Cave of the Web, I can only assume you have absolutely no interest in open-source software development (in which case, you must be barmy).
*I don't mean you're mad, unless you're a fully bearded, sandal-wearing programmer type with a penguin fetish, but
rather that open source is the obvious way forward for the online world. And a community such as this is home to some of the most innovative developers and, therefore logically, to the most innovative and useful applications. Even if you treat it just as a depository for useful software, it's worth a visit - you won't be disappointed.
*I'll admit it does help if you're not too tied to the Windows approach to user interfaces, and even more if you're happy to roll up your sleeves and get y
our hands dirty with some coding. So it was that I stumbled upon a classic 'does what it says on the tin' tool named Dave's Quick Search Deskbar (<A HREF="http://sourceforge.net/projects/dqsd" target="_BLANK">sourceforge.net/projects/dqsd</A>). This small (380KB download) utility is for people who are so impatient when it comes to searching that even firing up a browser and typing a search string into the Google toolbar is far too time-consuming. Download it, install it, open it up as a toolbar
on your XP taskbar and you're pretty much ready to go. Just type a keyword string, hit Enter and a bog-standard Google search pops up in your default browser window.
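*There's no real magic under the bonnet: the deskbar simply builds a query URL and hands it to your default browser. A few lines of Windows Script Host will do much the same for a plain keyword search - the escaping here is deliberately crude, and the Google UK address is just the one I happen to use:
*searchTerm = InputBox("Search Google for:")
*If searchTerm <> "" Then
*' Very rough escaping: spaces become plus signs; anything fancier needs proper URL encoding
*Set sh = CreateObject("WScript.Shell")
*sh.Run "http://www.google.co.uk/search?q=" & Replace(searchTerm, " ", "+")
*End If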
*Of course, that isn't enough for me, but luckily this is one application that has plenty of hidden power. If you want to take advantage of Google's 'I'm Feeling Lucky' function and go directly to the top-ranked result site, just add a '!' to the end of your keyword. If you want a Wayback Machine search instead, prefix the keyword with 'archi' and that does the trick. Looking for something on Slashdot? No problem, just prefix with 'sd' and so on. Heck, it can even do you a hexadecimal base conversion, generate Roman numerals or set off a countdown timer with alarm right out of its toolbar box.
*I have to admit that in the land of the open-source application, things can get out of hand pretty easily. On the other hand, you can grasp any sort of functionality you want. Even if it isn't there by default, open sour
ce means you can add it yourself. With Dave's Quick Search utility, this means turning the target site's HTML search form into a well-formed XML search definition. It isn't particularly difficult if you know the basics of both, but it can be time-consuming nonetheless.
*This is where another little open-source utility (downloadable from the same URL as the Quick Search Deskbar) going by the name of DQSD Search Wizard comes in handy. Go to wherever you want to add a search facility, drop the cursor into the search query b
ox, click on the Wizard icon from the browser toolbar (you might need to add this from the Customise Toolbar function within IE) and up pops the Wizard form. Enter a search abbreviation and any other information you feel the need for and another click will perform the conversion into XML in Notepad, ready for you to use when adding that search to the deskbar. Sometimes this works straight off the bat, sometimes you need to do a bit of massaging, but it sure beats doing it all from scratch,
even if you do have a beard and open-toed shoes...
Davey Winder visits Google's technology playground and isn't impressed by all that he sees
104 (June 2003)
Online
Tom Arah
Extending Illustrator
The ability to extend Adobe Photoshop's functionality through the use of plug-ins is well recognised and most designers will have at least a couple of third-party filter sets in their armoury. Adobe Illustrator offers similar extensibility, but its vector-based plug-ins are hardly known, with none of the brand recognition of KPT or Eye Candy 4000. I decided to find out what's out there by trying ten Illustrator 10-compatible add-ons. Where possible, trial versions are included on this month's cover disc.
*Illustrator extensions don't have to break the bank. Visit the Sapphire Innovations site (<A HREF="http://www.sapphire-innovations.com" target="_BLANK">www.sapphire-innovations.com</A>) and you'll find a host of filters, brushes and shapes, all the work of Andrew Buckle. The range is impressive, but the downside is you can easily feel lost. Dig a little, however, and you'll find Andrew's 11 sets of Illustrator add-ons, costing from just £8 up to £40. Finding out what they al
l do is more difficult, as some have broad themes, while others are mixed bags. Bewilderment might continue even after you've installed your add-ons - Andrew clearly prefers to spend his time on developing new functions rather than documentation or streamlining.
*Take the very handy Symbol Paint tool (volume 8), which paints with symbol instances - its dialog lets you adjust no less than 30 parameters. Some of these can be intimidating but are definitely worthwhile. The mysterious Hit Typ
e/Request parameter, for example, enables you to restrict your painting to selected objects or paths. Other hidden gems include the Random Clutter, Random Text and Perspective Grid tools, and at £60 for the entire collection it would be churlish to complain. Having said that, most users would probably prefer a little less control and a little more usability.
*An excellent example of a budget add-on that's both simple and effective is Hot Door's MultiPage ($49 - £30 - from <A HREF="http://www.hotdoor.com" target="_BLANK">www.hotdoor.com</A>). This offers a simple floating palette in which you add new pages and select the page on which you want to work. You can print any page selection or export it to PDF using new File menu commands.
*Multiple page handling must top most Illustrator users' wish-lists, and you're forced to wonder whether MultiPage is too good to be true - is it just a trick? In a way it is. Behind the scenes, it works its magic simply by treating layers as p
ages. Whenever you create a new 'page', a new layer is added - with the relevant guide layer acting as master page - and all other layers/pages automatically hidden. This becomes obvious if you add a layer in the Layers palette, because a new page appears in the MultiPage palette too, but if you stick with sublayers you'll be fine. Let's hope Adobe sees sense with Illustrator 11 and finally scraps its artificial and maddening single-page limit. Until then, MultiPage provides a very handy w
orkaround.
*Equally simple and efficient is Vertigo's 3D PopArt 2 add-on ($59 - £37 - from <A HREF="http://www.vertigo3d.com" target="_BLANK">www.vertigo3d.com</A>), which transforms any selected path into a shaded 3D object. Again, it's a simple floating palette that lets you define the extrusion depth, rotate your object in X, Y and Z dimensions and set a lighting angle. As you change settings, a small preview cube is updated correspondingly; when you hit Apply, the effect is applied t
o the current path. It's nowhere near state-of-the-art compared with CorelDRAW's and FreeHand MX's extrusion capabilities, but it's all you'll need to produce the occasional 3D shape or heading.
**High-end products
*The add-ons so far all fall into the category of occasional one-off use - with prices to match - but other developers are targeting niche markets with more advanced and professional functionality. For example, Artlandia SymmetryWorks 2 ($165 - £102 - from <A HREF="http://www.artlandia.com" target="_BLANK">www.artlandia.com</A>) is aimed at users for whom pattern making is a central part of their work. Its basic principle is simple: select a path or paths, click on any of the 17 coloured icons in the SymmetryWorks palette, and multiple copies of your object(s) are rotated, glided and reflected to produce a symmetric pattern.
*The beauty of this system is that patterns remain live so you can keep adding X or Y tiles to fill the space required. More importantly,
you're able to edit the original 'seed' pattern, for example, by resizing, repositioning, recolouring or adding a new element, and the overall pattern gets updated. You may also add editable cloned 'replicas' of your seed and save these, together with the transformations that you apply to them, as 'layouts'. Working in this way, it's possible to control step and sliding effects, gradations and so on and produce virtually any kind of repeating pattern. There are lots of power features for
advanced users, but I wish Artlandia also offered a cut-down 'lite' version, as its basic symmetry effects offer plenty of creative power.
*Another important niche is the production of perspective drawings. The simplest and most powerful solution is Perspective, again from Brendon Cheves and the developers at Hot Door ($179 - £111 - from <A HREF="http://www.hotdoor.com" target="_BLANK">www.hotdoor.com</A>). It comes as several dedicated drawing tools, including the most obvious cube and cy
linder options, plus three palettes.
*In the first palette, Perspective Document, you choose the type of dimensional drawing you want to produce from isometric, oblique and perspective options. In the second, Perspective Grid, you define the spacing and colour of your grid. The third palette, Perspective Objects, is most impressive, showing a small thumbnail of a cube. By selecting one of its faces and then hitting the Update command, you can automatically project your currently selected
object(s) onto the appropriate plane, creating a true perspective image at a stroke. Most impressive of all, this perspectivised artwork remains live, so if you change the grid settings - say, from isometric to oblique, or back to head-on for editing - the artwork automatically updates.
*Hot Door also has a solution for producing technical drawings. CADtools 2.1 ($199 - £124 - from <A HREF="http://www.hotdoor.com" target="_BLANK">www.hotdoor.com</A>) isn't as eye-catching as Perspective, b
ut it offers four comprehensive setup and control palettes and a range of 38 drawing and dimensioning tools. Highlights include automatic callouts with dimension and label styles, numeric input and control, customisable document scales, and scaled move, transform and repeat capabilities.
*One of the most common tasks for Illustrator users is tracing scanned bitmaps to produce truly scalable and pin-sharp vectorised versions, for example, if you're involved in exhibitions or producing large
-format signs. This is where LogoSpruce ($199 - £124 - from <A HREF="http://www.comnet-network.co.jp/eng" target="_BLANK">www.comnet-network.co.jp/eng</A>) comes in, offering a range of tools for tracing imported bitmaps. I expected these to offer automatic image tracing, but that's not what LogoSpruce is about; for example, the Line Trace tool is designed to let you, wait for it, draw straight lines!
*After my initial horror, I gradually came to appreciate what LogoSpruce does. To start
with, there's a whole range of tools to assist the manual tracing process, including adding one-click guides and the automatic straightening of imported bitmaps. The tracing tools themselves include a number of useful and innovative options like adding perfectly smooth curves at a tangent to the current line. Even LogoSpruce's line tool is far more powerful than Illustrator's own, as it automatically identifies endpoints, midpoints and tangent lines, lets you swap between tool variations a
nd zoom in and out on your image using keyboard shortcuts. Finally, there's another set of tools designed to correct your artwork, including a Trim tool that automatically deletes a line back to the nearest intersection. All told, LogoSpruce offers a comprehensive and well-thought through range of functions for reproducing raster artwork as clean vectors.
*If you'd still prefer a tool that does the work for you, there's another plug-in that fits the bill. The Silhouette plug-in ($244, from <A HREF="http://www.silhouetteonline.com" target="_BLANK">www.silhouetteonline.com</A>) is built on the same tracing engine as the well-respected standalone Silhouette application (only available for the Mac). Select the new Vectorize Raster Image tool and use the Silhouette palette to set tolerances, choose black-and-white or colour and, if the latter, whether you want to trap coloured objects so they overlap slightly. Once you've chosen your settings, click on the bitmap and a s
econd or so later the vectorised version appears automatically.
*The problem with such automatic tracing solutions, as LogoSpruce has realised, is that it can take longer to clean up the results than to reproduce the artwork from scratch. Silhouette comes into its own here with a range of dedicated correction tools for automatically aligning control points, sharpening corners and smoothing and simplifying paths. The Silhouette palette even offers a large Preview window where you can see w
hat effect your tool settings will have before you apply them.
**Standalone?
*Silhouette's unique combination of automatic tracing and dedicated correction is impressive, but you have to ask yourself whether you wouldn't be better off with a standalone solution such as Adobe Streamline. The same could be asked of other professional plug-ins aimed at niche markets. For perspective work, wouldn't you be better off using Macromedia FreeHand? For CAD, wouldn't you be better off with Corel DES
IGNER? There's certainly a lot of sense in asking the question, and if you're not already an Illustrator user, the answer is a definite yes.
*If you are an existing user, though, there are plenty of advantages to adding the extra power right there in your preferred drawing environment (and one which offers such power, creativity and integration). And while these plug-in developers might not be household names, I was surprised at just how professional and workable their products are. But i
n the end, the unbeatable benefit of these small developers is that they know exactly how Illustrator is used in real-world production environments and how to make the most of it.
*In between these budget one-off plug-ins and high-end targeted solutions, there lie two stand-out collections that manage to provide such wide-ranging new functionality and creativity that they are, in effect, third-party upgrades for Illustrator. The first is Vector Studio 2 ($129 -
£80 - from <A HREF="http://w
ww.virtualmirror.com" target="_BLANK">www.virtualmirror.com</A>). I was a big fan of the first release, which introduced a number of features such as interactive Morph Brushes, Envelope Meshes and the non-destructive application of bitmap filters. Adobe was clearly impressed too, as it bought Virtual Mirror's distortion-based tools and implemented its own envelope and filter handling.
*After Illustrator had incorporated the main highlights of version 1, version 2 was never going to be qui
te such a 'must-have', but it's still remarkable. Two palettes for improving Illustrator's existing gradient handling and for adding a whole new style of gradient-based texture have survived from the first release, along with the ability to apply effects as lenses. There are also new tools to improve shape editing and simplify paths, plus the ability to arrange and save palette layouts.
*More eye-catching are the new Retouch brushes, which let you change the brightness, hue and saturation
of objects and gradient mesh nodes simply by painting over them. Even more powerful and innovative are the new Sampler tools, where you can change an object's colour, transparency and even size, based on sampled object settings, gradients and even imported bitmaps. It takes a while to get your head around just what these Sampler tools are able to do - for example, cloning an imported bitmap as appropriately coloured and sized halftone dots - but their creative power is undeniable (and far
enough from the mainstream that Adobe is unlikely ever to incorporate its own version).
*One plug-in collection that I hadn't come across before is FILTERiT4.1 ($129 -
£81 - from <A HREF="http://www.cvalley.com" target="_BLANK">www.cvalley.com</A>). Again, the range and the depth of features on offer stand out. In FILTERiT, these are divided into three main sections: new tools for Illustrator's toolbar, new commands for the Filter menu and new 'live effects' controlled from their own pale
ttes.
*Looking at the Wave, Warp, Roughen, Lens, Broom and Craft (pinch and punch effects) tools, it's clear that, like Vector Studio, FILTERiT also pioneered interactive distortion effects for Illustrator, although now that Illustrator offers its own versions their benefit is obviously much reduced. However, FILTERiT has cleverly given its tools a new lease of life with a 'trace' capability that produces multiple copies of the object in between its original and filtered states. After releasing
these to layers, they can produce eye-catching Flash-based animations. Other new tools include AlignPoints options that let you quickly straighten lines and align objects horizontally and vertically, and the Trail tool, which leaves copies of the current object along the path that you drag - again very useful for Flash animations.
*My favourite tool is the MetaBrush, which works rather like Sapphire Innovations' Symbol Paint tool but with any object or objects. Select your objects, then t
he brush, and begin painting with them. For greater control, you can set the space, size, angle and orientation of the added objects and decide how these settings should vary based on pressure, direction, number of steps and so on.
*FILTERiT's two added Filter menu commands show just how much power may be packed into an add-on. The first, Fractalize, adds new path segments to the paths of selected objects, of a shape, number, height and variation that you select. You can apply the same pro
cess to the added segments themselves, producing self-similar fractal-style effects. As with many FILTERiT tools, the settings can be saved as presets and the Fractalize command can also be applied non-destructively from the Effects menu.
*More powerful still is the 3D Transform filter, which lets you transform objects to a specified 3D shape. This acts like a mini-application in its own right, built around a large preview window. By dragging in the preview, you can rotate an object aroun
d the X and Y axes and, by Shift-dragging, around the Z axis. You may also specify a 3D rotation both before and after the transformation is applied. The real power is seen when you select from the 12 types of transformation on offer - simple rotation, twist, sphere, cylinder and so on - each with a host of controls. Whether you're producing a simple perspective change or an eye-catching 3D spiral, the power is all here (the only obvious omission being realistic shading).
*What gives the F
ILTERiT collection its final edge is its range of 13 effects, each controlled from its own palette. The effects are broadly split into two categories. The first group - Border, Cutout, Emboss, Neon and Shadow - is concerned with controlling the appearance of the selected object's path and works by duplicating, recolouring, blending and offsetting objects. Such effects are relatively common nowadays and Illustrator offers its own shadow and glow effects, but FILTERiT provides greater contro
l and keeps its effects vector-only.
*The second range - Circle, Explosion, Frame, Galaxy, Generation, Reflections, Tiling and Trail - is also built around duplicating the selected object, but this time using the duplicates to produce patterns. These effects act almost as the cut-down version of SymmetryWorks I was wishing for earlier, and - as with SymmetryWorks - their great strength is that they remain live so you can always change settings. For example, you could change the offset and
shadow of the Border effect's bevel or add copies and change the spread of the Explosion effect.
*FILTERiT stands out from the crowd, but in many ways it's typical of all these third-party plug-ins, from Sapphire Innovations to Silhouette. In particular, its combination of tools, commands and live effects shows the unique benefit that Illustrator offers to its third-party developers - complete integration. By comparison, Photoshop plug-ins seem semi-detached at best.
*Having said this, F
ILTERiT is also typical in that its effects can be demanding to get to grips with and aren't suitable for occasional users. In fact, all these Illustrator plug-ins, with the honourable exceptions of MultiPage and PopArt, tend toward the technical and can be difficult to master. Once you've mastered them, though, each plug-in richly rewards the effort by seriously boosting productivity and creativity. Illustrator's plug-ins demonstrate that you can teach an old dog new tricks, and do it suc
cessfully.
Tom Arah looks at the full range of add-ons available for Illustrator 10. June 2003
104 (June 2003)
Publishing/Graphics
Mark Newton and Paul Ockenden
.NET gain
The latest beta version of Visual Studio .NET landed on Mark's desk the other week and, after some initial troubles installing it - plus huge amounts of hard disk space being consumed - a new and shiny developer's tool appeared. A lot has been said about .NET and whether it brings any advantage for the website builder, so we thought we'd take a web application builder's view of it this month. Since Visual Studio is a programming environment, we decided it would be fun to try and implement
a traditional Visual Basic program, but as a browser-based .NET Web Form application.
*Mark had to do something like this in 1997 for CityScan, but back in those days the browsers were all version 3 with no DHTML, no CSS and there were still major differences between IE and Netscape. Mark had considerable 'fun' coding this up using a site with nine framed areas and hand-rolled JavaScript to provide all the functionality. It's a current project of Mark's to revisit this web application and
implement it using Web Forms and Web Services, but for now we'll look at what can be done to implement a simpler application as a Web Form. This application displays records from a database table into a data grid - fairly standard stuff.
*The first two screenshots show the Windows application written in Visual Basic 6 and the web application written as a Web Form with Visual Studio .NET 2003 beta. As you'll see, they're remarkably similar, the main differences lying in the tab and data g
rid controls, which under .NET aren't as refined as the Visual Basic ones, though we're sure they'll be improved given time. The thing that struck us most about building this Web Form was that you write it just as you would write a VB application - that is, you open a form and drag objects onto it, position them as you want and then set their properties to 'wire' them up together. There's little initial coding required in the early stages of UI development when designing either a Visual Ba
sic application or a .NET Web Form. Gone are all those battles with using HTML tables for positioning or hand-coding of scripts for validation, and along comes full debugging and drag-and-drop design. The code you write is in the language of your choice, with full hinting to help with its syntax.
*As for the components that you use, while many are supplied, there are even more being produced out there by third-party software vendors for you to buy. The Web Form that we built connects a dat
a grid to a SQL server database and, as you select the tabs along its top, the data grid gets repopulated with the corresponding records from the database. All the components we used came bundled with Visual Studio, although we couldn't find a multiline tab control that was available as a demo and had to make up the tabs along the top using buttons, hence the slightly strange look of this part of the UI. Given time, we're sure that better tab controls for .NET will appear.
*It was interes
ting that while programming we noticed the absence of certain properties from otherwise familiar controls - there are areas of difference between VB and VB .NET that will continually trip you up at first. It's important to realise that you can't just take your implementation in VB and rewrite it in VB .NET without first considering some of the implications. If you use databases and data controls, one of the biggest differences is that .NET uses 'disconnected datasets'.
*To understand what
this involves, consider a 'normal' Windows application that talks to a database. This database is either part of the application or may be somewhere out on the network if the application in question is of the client-server type. In both cases, the application is permanently connected to the database and quite reasonably doesn't expect it to just 'disappear'. However, with a Web Form that's connecting to a database over the Internet, it isn't unreasonable to expect the connection to slow d
own or even break. It's here that disconnected datasets come in handy. The dataset inside the Web Form connects to the external data source using a .fill command, after which all the other data controls talk to this local dataset rather than directly to the external data source. When an update is required, the dataset sends its copy of the data to the external data source and refreshes that copy once the updates have been committed. This means the performance of th
e Web Form application isn't heavily dependent on the speed of the Internet connection, and the application can happily handle being offline.
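*To make the shape of this concrete, here's a minimal VB.NET sketch of the Fill-then-Update pattern just described - the connection string, table name and DataGrid1 control are our own placeholders for illustration, not details from the Web Form we actually built:
' Minimal sketch of the disconnected-dataset pattern (illustrative names only).
Imports System.Data
Imports System.Data.SqlClient
Public Class InvoiceGrid
    Inherits System.Web.UI.Page
    Protected DataGrid1 As System.Web.UI.WebControls.DataGrid
    Private Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles MyBase.Load
        Dim conn As New SqlConnection("server=(local);database=Sales;integrated security=SSPI")
        Dim adapter As New SqlDataAdapter("SELECT * FROM Invoices", conn)
        Dim ds As New DataSet()
        ' Fill copies the rows into the local, in-memory dataset and releases
        ' the connection - from here on the grid only talks to this copy.
        adapter.Fill(ds, "Invoices")
        DataGrid1.DataSource = ds.Tables("Invoices")
        DataGrid1.DataBind()
    End Sub
    Private Sub CommitChanges(ByVal adapter As SqlDataAdapter, ByVal ds As DataSet)
        ' Only when an update is required does the dataset reconnect: the
        ' command builder generates the SQL (assuming Invoices has a primary
        ' key) and Update pushes the changed rows back to the data source.
        Dim builder As New SqlCommandBuilder(adapter)
        adapter.Update(ds, "Invoices")
    End Sub
End Class
*In a real Web Form you'd also have to decide where the dataset lives between postbacks - Session state, a cache or simply a fresh Fill on each request - but the disconnected pattern itself stays the same.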
*An interesting point is that while writing this Web Form, not a single word of HTML was written. All the design and programming was done within Visual Studio and VB and then the application was 'built' and run. If you click Show Source in your browser when the web application is running, you'll see the HTML that's generating the Web Form and can not
e that it looks fairly familiar, full of HTML tags and snippets of JScript. Just to see how tightly this solution would be tied to Microsoft's own browser, the Web Form was also tried out in Netscape 6.1 and, although there were some changes to the look of certain controls, the core functionality still worked.
*Once you've written and tested your web application, you'll need to deploy it onto a web server and configure this to run the application. Again Visual Studio helps, as you can crea
te a web setup project that will install all the necessary files and configure any web server settings. You can, if you wish, test for the presence of the .NET Framework and install this on the server if it doesn't have it, although because this is some 23MB in size it's probably better to install it manually. It only needs to be done once and then your web application files will all become a more manageable size.
**FrontPage 11 and XSLT
*In previous articles, we've talked about Web Serv
ices and 'consuming' the data from them. This data comes in XML format and, for it to be usable, you'd either use a web proxy or an XML template known as an XSLT. Web proxies are - in the case of Visual Studio - built using a command-line script or - in the case of Macromedia's Dreamweaver MX - on the fly as you connect to a Web Service. However, XSLTs offer a simple way to format XML data and, being simple text files, you can use any text editor to produce them. This is all very well, but
it puts designing XSLTs currently on a par with coding HTML in Notepad: there's no graphical tool allowing drag-and-drop of data fields in the XML data, for applying styles and generating an XSLT from the result. Until now that is - by the time you read this, the embargo will be off and we can talk about Microsoft's Office 2003.
*The product in the Office suite in which we're most interested is FrontPage. In the past, we haven't been too kind to FrontPage as a web design tool, although a
s a design-and-configuration tool for a lot of other Microsoft technologies like SharePoint it's often the product of choice. FrontPage now has another claim to fame in that it is - so Microsoft says - the world's first fully graphical XSLT editor, and as such we're sure it will be welcomed with open arms by those of us who have to design such things. Next month we'll be looking at FrontPage 2003 in more depth to see what other treats are in store.
**Slow burn
*If you had to choose a sing
le word to define the way we currently live, the word that sums up modern times, that word would possibly be 'fast'. Everything is fast these days: fast food; fast cars; fast lifestyles. Our kids can't cope with 'slow', have the attention span of gnats and many of them can't even survive to the end of an MTV video. Even our comedy is served up in 30-second chunks, courtesy of the Fast Show. And let's not forget our Internet connections.
*At their office, most professional web designers wil
l probably be sitting at the end of a big fat MegaStream pipe plumbed directly into Telehouse (the communications heart of the UK's Internet infrastructure), while at home they'll almost certainly have some kind of ADSL or cable modem connection, again just a couple of 'hops' and a few milliseconds away from Telehouse. Those of us who are thwarted by both BT and the cable companies and can't get broadband at home will have our houses on the market in search of a Luddite buyer, and will be
actively looking to move somewhere a bit more 'wired'. What was once a pipe-dream has now become a necessity, and many of us simply couldn't do our jobs if we weren't sitting on the end of some form of high-speed window onto the outside world.
*There's one problem, though. As most of us have only had these high-speed connections for a short time, not only have we become totally dependent on them but we've also forgotten what it was like to struggle along with a 56K modem waiting for pages
to load. Hmm... 56K modems, now there's a case for the Trade Descriptions Act. As Jim Royle might say, '56K my arse!' Did anyone ever see a 56K modem actually connect at 56K? But that's all in the past. We have broadband now and we've become totally complacent.
*And the result of this complacency? A web full of monsters. Oh, sure there are some beautiful monsters, finely detailed monsters, exquisitely designed monsters, even. But, nevertheless, monsters. We're building huge sites that are
no longer usable by people using slow connections. Our designers are creating 'heavier' sites - pages composed of layer-upon-layer of graphics, big background images, massive Flash movies, scads of preloaded animations and let's not forget sound. Every web page now seems to want to either sing or talk to you, and each mouse click will be accompanied by a wonderful musical fanfare. Wonderful, that is if you have a high-spec PC with the latest Logic Pro THX Super Dolby Digital Surround speak
er system. To the masses out there, with 'these came with the PC' speakers, the glorious fanfare will usually be transformed into a pathetic farting noise.
*Our programmers and HTML 'code monkeys' are no better than our designers. They too are building for broadband only. Where once they'd write code that pages through data ten records at a time, these days it's more likely to be a hundred. Where once the tags on a page would be optimised by hand to speed up page loading, today you'll find
superfluous tables-within-tables-within-tables, enough white space to fill a small book and great swathes of 'last year's code' surrounded by comment tags.
*Don't be too hard on these designers and programmers though, as it's easy to see how such complacency arose. For those involved in building the sites, their pages will still load very quickly both at work and at home. When you only ever access the Internet via a fast connection, and when all of your friends have broadband too, it's so
easy to forget that there are still people out there with slow modems. There are people who can't get broadband; people who don't want broadband; people from countries with poor connectivity; people who can't afford
£25-£30 per month. In our rush to launch new sites, we tend to forget about these people.
*It's crazy when you think about it: we'll spend ages making sure that a new site works with IE 3, IE 4 and some horrendous old version of Netscape, but then we'll completely forget to te
st it using a 33.6K modem. The reason is obvious - most of us no longer have dial-up accounts to test sites from and don't even have modems. Can you remember the last time you heard that screeching noise as a dial-up connection was made? If the answer is no, you're almost certainly one of the people we're talking about.
**Getting sloppy
*There is, you'll be pleased to hear, a solution to this problem, which comes in the form of a Java Web Start application that goes by the name of Sloppy.
What it does is simple enough - it deliberately slows down data transfer between the client and the server. You could be sitting in Telehouse or connected directly to a LINX (London INternet eXchange) switch port, but it wouldn't matter - running a site through Sloppy means you'll see it just as a modem user would see it.
*Sloppy was written by Richard Dallaway and you'll find it on his website at <A HREF="http://www.dallaway.com/sloppy" target="_BLANK">www.dallaway.com/sloppy</A>. If you
already have Java Web Start installed on your machine, you can just download Sloppy and run it. If not, you'll need to pay a visit to <A HREF="http://java.sun.com/products/javawebstart/index.html" target="_BLANK">java.sun.com/products/javawebstart/index.html</A> and install Java Web Start. The latest version is shipped as part of the Java 2 platform, Standard Edition (J2SE) and we had no problem running Sloppy using that. Earlier releases of Java Web Start are available in standalone form
, but it's probably safest to stick with the most recent release. When you get to the download page, just select the JRE version for your platform (Windows, Linux or Solaris) - you don't need the SDK to run Sloppy.
*Once you have Web Start installed, simply click the 'Launch Sloppy' link on Richard's site, which takes the form of a JNLP (Java Network Launching Protocol) file. You might get a dialog asking what you'd like to do with this file - if so, select the option to open/run it and th
e Java Web Start environment you've installed on the machine will know what to do with it. This download will install Sloppy and register it within the Web Start environment. You'll probably see a warning that Sloppy is requesting access to your machine, but just click the Start button and a few moments later you'll see Sloppy's small user interface.
*As you'll see from the screenshots, the Sloppy user interface is simple: you just type in the URL of the site, select the modem speed, and then hit Go. Sloppy belongs in the 'essentials' toolbox of every web developer.
Mark Newton gets a new tool, while Paul Ockenden gets sloppy. June 2003
104 (June 2003)
Web Business
Jon Honeyball
Taking the tablets
I've finally bowed to the inevitable and purchased a Tablet PC, as I knew I eventually would. I say this in such a sad, resigned way because I believe most laptop manufacturers are currently peddling tired, unimaginative products that are neither one thing nor the other. On one side, they attempt to be portable, but are much too heavy and clumsy to be something you carry around all the time; on the other, they attempt to be a desktop, but the limitations on keyboard and mouse quality soon put a stop to that.
*Until now, my ideal 'transportable desktop' laptop has been the Apple PowerBook Titanium, and with the recent launch of the 17in screen version - complete with DVD cutter, wireless and every sort of port known to mankind - it will be hard to resist upgrading.
*I've already tried every PDA on the market and found them all wanting: too small, too fiddly, too limited, too slow and some have battery lives that are ridiculously short. And don't get me started on the current
wave of smartphones - they're truly horrible confections. What I wanted was something like an electronic pad of A4 paper that I could carry around with me all the time, and I think I've found it in the HP/Compaq Tablet PC.
*This machine has a keyboard that detaches, and happily there's no sign of it trying to be one of those weird 'everything to everyone' designs that's basically just a conventional laptop with a rotating screen. Last summer, I was fortunate enough to get hold of an Acer
prototype from the Microsoft workshop, but after a month I was glad to give it back. It was a victim of its own indecisiveness and ended up being only a moderately effective laptop and a thoroughly awkward Tablet PC. The Compaq is different and is firmly aimed at being carried around all the time. The keyboard works well enough for those times when you want a keyboard, but for me it will be used almost never, because the on-arm and on-desk operation of the Compaq is so good.
*I did worry
that the lack of pressure sensitivity on the pen would become an issue, but I resolved this by reminding myself that I'm like a clumsy three-year-old when it comes to graphical design, and that adding pressure sensitivity is unlikely to turn me into Leonardo da Vinci. The memory as delivered at 256MB is a little stingy, but this is because the Crusoe processor takes some 40MB for its own use - a quick
£30 256MB upgrade fixed that anyway. Some might find the Crusoe a bit on the slow side, b
ut I don't agree. In a truly portable device, you want to have just enough power to do your work and nothing more if the battery life is to be at all useful.
*Now I'm really looking forward to the next beta release of Office 11, which comes with the OneNote application mentioned in previous columns. Being able to voice record meetings and integrate the data with my doodlings will be a major productivity breakthrough for me.
**jDiskReport
*For years, I've been looking for a straightforwa
rd application that shows how much disk space has been consumed on a drive, and how it's split between the various directories. Most of the applications I've come across so far have been overly complex or try to do 30 other things at the same time. However, I've finally found my nirvana in an applet called JDiskReport, a download from <A HREF="http://www.jgoodies.com" target="_BLANK">www.jgoodies.com</A>. This is a site set up to showcase the Java programming capabilities of its author Kar
sten Lentzsch, who lives in Germany. There's a Web Start installation of JDiskReport for those who run Windows, or full instructions on how to start the Java package for those on almost any other platform.
*The utility is brilliantly simple in operation and its user interface is uncluttered. I can click on any part of any of the graphs and drill down to get the detailed information. Performance is excellent and it just does the job. Best of all, it's free and won't time-expire. Go to Lent
zsch's website and check it out.
**OWA Yes!
*For many companies, the ability to get into your corporate email system from anywhere in the world, irrespective of the computer you're using at the time, is a mission-critical function. It even has relevance right down to the small business user, who wants to be able to check email from home but certainly doesn't want to get into the complexity of bouncing office emails to a home-accessible mailbox or something equally messy. For this reason,
Exchange Server's Outlook Web Access (OWA) becomes a killer feature for those deploying Exchange. In OWA, you can see the clear benefit of having a centralised mail store and not allowing email to go down to the desktop itself. By keeping it stored centrally, you can access it from any number of machines running normal Outlook, for example.
*But you also get the OWA capability. To refresh your memory, OWA is a set of Web Services in Exchange Server that tightly couple to the IIS web serv
er on the same box. The ASP code allows you to log into your mailbox and have everything presented to you with full editing and reading capabilities.
*I can go into a web café on the other side of the planet and get into my email, which makes this a potent application. I've even used it in places where I suspected the traffic might be being snooped - a web café
in Rio de Janeiro in December, for example, where several dozen kids were playing networked Quake and a surly owner might well ha
ve run a packet sniffer on the Internet connection.
*To ensure the security of my information, I made sure I logged into my web server using HTTPS encryption. In past columns, I've explained how easy it is to get Windows 2000 Server to generate its own self-referential SSL certificates for just this purpose.
*With the forthcoming Titanium release, otherwise known as Exchange Server 2003 (ES2003), Microsoft has put a lot of effort into building up OWA and ensuring that it offers an even r
icher experience. There were many holes in the ES2000 version and lots of areas that could have been improved. However, ES2003 now delivers with a vengeance. Basically, it looks and operates just like Outlook 11, which is in the next release of Office. There's the same three-pane view, the drag-and-drop capabilities, and there's a richness that will blow away users used to ES2000 OWA. You can even get into a window that allows you to modify your server-side rules. Sorting out client- and s
erver-side rules in Outlook itself can be a pain, but with this new OWA window you get straight to the server side.
*Are there any downsides? Well, such a rich client experience takes a fair bit of downloading, and you might find it a little sluggish on a slow Internet connection, at least for initial loading. Also, the full richness only works on IE 6, which means Windows clients. The experience on 'lesser' browsers is still good and certainly an improvement on previous versions, but it
falls a long way short of the Windows/IE 6 experience.
*For many people, the new facilities in OWA might well tip them into upgrading from ES2000 to ES2003. They'll get the other benefits too, which will be mostly relevant if they upgrade to the new Outlook at the same time. But given that ES2003 will run on Windows Server 2000 and doesn't require Windows Server 2003, it's an incremental upgrade that can be done at your own speed.
**Slow Login Again
*Yet again I've come across this issu
e, so it's time to nail it once and for all. The scenario is quite simple. You have a Windows NT 4 server network with a bunch of XP client machines. It's decided that you'll upgrade to Windows 2000 Server over a weekend, and the on-site local IT junior goes pale at the thought. So a self-styled consultant is retained to mutter the necessary mantras over the server one weekend. The server upgrade seemed to go fine, so the consultant goes home and explains to his wife that it was a hard day
at the office (inserting a CD-ROM and pressing Enter a few times).
*On Monday morning, all hell breaks loose in the office. The users of the desktop machines find it takes forever and a day to log into the shiny new server ('forever' in this context meaning up to 15 minutes or so).
*Finally, each machine does log in successfully, so work can start. At this point, the company hauls the consultant back in to fix the problem. There's much scratching of collective heads, with the consultant
swearing blind that the server is fine and the local on-site IT junior saying that nothing has been changed on the desktops.
*After a few days, someone like me gets called in, which makes me feel like the chap in the circus ring who follows the elephant with a bucket and spade. You don't know what things have been tweaked in the meantime to attempt to get the server to be stable, and Windows 2000 has a sufficiently large number of available adjustments that spotting something unhelpful c
an be hard.
*However, this one turns out to be easy to fix. Go to a desktop XP machine, open up the TCP/IP properties and there you'll find the address of the DNS server, and I'll bet you dollars to doughnuts that it's the IP address of the DNS server at the company's ISP, somewhere down at the other end of their ADSL, Kilostream or ISDN line.
*The IT junior put this address in because that's what the ISP told him to do, and it worked just fine for web browsing. Let's not worry for a mom
ent that every DNS query was being sent up the line, or that while this was happening the firewall was probably left wide open to allow each desktop machine to contact the outside world. So, you change the DNS server entry from the ISP's to the IP address of the Windows 2000 Server, which coincidentally is running DNS services too. Log out, log back in and marvel how it now takes no time at all.
*The obvious question is, 'what's going on here?' Well, you have to remember that Windows XP i
s Active Directory (AD) aware, so it knows it's on an AD network because it's joined to one. When you try to log in, XP attempts to find a local AD server to authenticate your request, and this process of location is done via - yes, you guessed it - the DNS server running on the Windows 2000 box, which knows all about where the AD boxes and services are located (these are the Resource Records in the AD DNS whose names start with underscores).
*If the Windows client is pointing to the ISP'
s DNS, it will ask the ISP's DNS where Mama is and then, not surprisingly (and in a wholly Wizard of Oz sort of way), it won't have any idea how to get home. The desktop machine says 'please, please, please' and the server says 'nope, I won't tell you'. This pathetic little psychodrama is played out for about 10 to 15 minutes until finally Dorothy the Desktop gives up and screams a broadcast around the local network. The internal DNS server and AD box pipe up with a 'here I am' and t
he desktop finally logs in.
*I don't know how much more clearly I can say this: DNS is critical to the operation of Windows 2000 and AD. It's used for just about everything, and on the client side you must point your DNS queries to your own AD/DNS box for resolution. By 'must', I don't mean 'it's a good idea', I mean 'there's no alternative'. And don't screw things up by putting in secondary DNS server entries that point to the ISP's DNS server. I've seen that in several installations, wh
ere the AD/DNS has been accidentally set up to believe it's root authoritative and that the rest of the Internet doesn't exist. In desperation, the IT admins have put the ISP's DNS server address as the secondary DNS server in order to get some sort of external resolution. Can you imagine the mess this causes?
**Get Wet
*I recently needed to set up an Internet connection in a room that had no network ports. I didn't want to drape Ethernet cables around the door frame and halfway down the
hallway to a nearby network socket, but unfortunately none of the client computers had wireless networking capability. A solution was found in the excellent Linksys WET11 Wireless Ethernet Bridge, which I discovered a few months ago. I purchased a couple ready for just such an event.
*The WET11 consists of a small box, not much bigger than a paperback, which has an external power supply unit and an Ethernet port, and a large aerial that you screw onto a small plug. The easiest way to con
figure it is to attach a machine to its Ethernet port and run the software supplied on a CD-ROM, which will walk you through the setup process and takes only a few seconds. After a reboot of the box, it should be up and running. From here, you can just point a web browser at the IP address of the Linksys box and remotely manage it via HTTP.
*Once I'd confirmed that I had a wireless connection to the nearest Apple AirPort base station, I plugged a 16-port switch into the Ethernet port of t
he Linksys and, voilà
, I had an instant desktop network access point in a room that had no Ethernet.
*A wireless Ethernet bridge like this has so many uses: you can use it around the house to get networking into a room that has no wired access, or you could use it to place an Ethernet-connected printer somewhere more useful. Obviously, its forte is working with devices that have no wireless capability of their own, though I'll confess I'm tempted to try plugging an AirPort base into the L
inksys to see if this coupling will allow me to build a wireless extension bridge and thus get Internet connectivity in some remoter parts of my office.
**Wiebetech adaptors
*A couple of months ago, I mentioned the WiebeTech hard disk to FireWire adaptors, which enable you to mount a bare IDE hard disk onto the adaptor and then hot-plug it onto a FireWire bus. They come in two main sizes: one for desktop hard disks and one for laptops. I ordered one of each and they've just arrived. They
're nicely built, with machined metal cases and quality connectors, and they just work. Plug in an IDE hard disk of unknown parentage, pop it onto a FireWire bus and instant plug-and-play - the disk is mounted.
*If you run a hardware repair shop, or are an IT department that gets hard disks in machines that have broken down, these adaptors will be worth their weight in gold. They're not expensive and, despite HM Government's best attempts at liberating a further
£50 from me for import dut
y and so forth, I reckon they're an invaluable tool to have in your toolbox.
Resistance proves to be futile for Jon Honeyball as he delves into his pockets to buy a Tablet PC.
May 2003
103 (May 2003)
Advanced Windows
Simon Jones
Invoices in Excel
I recently got an email from Peter Andrews, who has been helping his local garage by doing some programming in return for servicing. Peter has been developing an Excel template for invoices, and everything was going along fine, even the bit about MOT tests not being VATable. However, he ran into trouble when trying to make the invoice number increment automatically for each invoice:
*'I feel I could now design an invoice template from scratch in Excel, but I don't know how to write a macro
in the template, which will look for another workbook (for example, invoice number.xls), find a number in cell A1, increment it by 1, save invoice number.xls, then write the incremented number into a specified cell in the opened template.'
*It's funny how things come around, as I covered this problem in my first column for PC Pro six years ago (issue 30), so perhaps it's time to revisit and expand on this topic. The basic answer to Peter's question is that when you open an Excel workbook
, whether an existing or new one, Excel fires up the Workbook_Open event. If a new workbook is based on a template, you can write code in the Workbook_Open event of the template and this will be run whenever a workbook is created from that template or you open an existing workbook that was previously created from that template.
*Peter wants to assign the next invoice number to a newly created invoice, but doesn't want to overwrite an existing invoice number whenever the user reopens a sav
ed invoice. You can use the fact that a newly created invoice won't already have an invoice number to avoid overwriting any existing invoice number - only run the code if there isn't an invoice number there already.
*One of the easiest ways to identify which cell contains what data is to set up 'named ranges', which gives a meaningful name to a cell or group of cells, so you can refer to 'InvoiceNo' instead of 'M3'. To create a named range, highlight the cell or cells, type the name and pr
ess Return in the name box on the far left-hand side of the formula bar. For this piece of code, I created named ranges called InvoiceNo and Date.
*The range method will return a cell or collection of cells. You can identify which cells to return using the cell addresses, such as 'A2', 'B3:D4', or by specifying a named range. So, Range ("InvoiceNo") will return the cell I just named as InvoiceNo. If this cell's Text property is empty, it doesn't already have an invoice number so you'll ne
ed to get one.
*To open a workbook, you must call the Open method of the Workbooks collection, passing the name of the file to open. To make this work, you have to create a new workbook, put the next invoice number into a cell, name that cell 'NextInvNo' using Named Ranges as before, and save the workbook, remembering the exact name and path to the file. That name and path need to be put in the code in the call to Workbooks.Open. In the workbook you open, you can find the named range 'Next
InvNo', increment it and remember the value, then you need to save and close the workbook. Now put the number into the InvoiceNo cell on your new invoice. A similar, but rather easier, bit of code can check for the existence of an Invoice Date and put today's date in if there isn't one already (see Code Listing One).
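*Code Listing One itself is on the cover CD rather than printed here, but its logic as just described might be sketched roughly like this in the template's ThisWorkbook module (the path to nextinvoiceno.xls is a placeholder you'd change to suit your own setup):
' Rough sketch only - not the actual Code Listing One.
Private Sub Workbook_Open()
    With Me.Worksheets(1)
        ' Only number a brand-new invoice; a reopened one keeps its number.
        If .Range("InvoiceNo").Text = "" Then GetInvoiceNumber
        ' Likewise, only stamp the date if there isn't one there already.
        If .Range("Date").Text = "" Then .Range("Date").Value = Date
    End With
End Sub
Private Sub GetInvoiceNumber()
    Dim wb As Workbook
    Dim NextNo As Long
    ' Placeholder path - point this at wherever nextinvoiceno.xls lives.
    Set wb = Workbooks.Open("C:\Invoices\nextinvoiceno.xls")
    NextNo = wb.Worksheets(1).Range("NextInvNo").Value
    wb.Worksheets(1).Range("NextInvNo").Value = NextNo + 1
    wb.Close SaveChanges:=True
    Me.Worksheets(1).Range("InvoiceNo").Value = NextNo
End Sub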
*There are several things to notice about this code. While it does what Peter asked, it may not be exactly what he needs. What happens if you open the template file itself and
make a change to it? The next invoice number will get incremented and put into the template itself, and now the template won't work because the code tells it to do nothing if the invoice number and date field are already filled in.
*You have to remove the invoice number and date from the template before you save it. You also need to open the nextinvoiceno.xls workbook, decrement the number stored there and save that workbook to avoid having an invoice number missing from the sequence. Y
ou get a similar problem if you start a new invoice and abandon it, hence 'wasting' an invoice number. This missing number problem is a consequence of assigning and incrementing the number on creation of the invoice. On the other hand, assigning the date to the invoice on creation is okay, as it's likely the new invoice will be dated today, and where this isn't the case you can type the new date before saving and printing.
*This brings us to the question of where to save completed invoices
and what to call them. If you don't provide any code for this, the user will have to point to the right folder and manually name the files when they're first saved, which may lead to invoices being stored in different folders and using different naming conventions. It would be better to set the folder path and name in code.
*So, if you move the call to the GetInvoiceNumber procedure from the Workbook_Open event and put it in a new SaveAndPrintInvoice procedure, you can get the next invoi
ce number, print the invoice and save it all at the same time. You may also ensure the invoice is saved in a predetermined folder with a consistent name, which includes the invoice number (see Code Listing Two).
*You can now make a toolbar button for the SaveAndPrintInvoice routine, which the user may click when they've filled in all the other details. But what if a user tries to print the invoice using the ordinary Print button on the toolbar or the File | Print... command on the menu? Th
e invoice won't be given a number and it won't be saved.
*You can solve this by making use of the Workbook_BeforePrint event. If you put your code to get the invoice number and save the workbook here, it won't matter how the invoice is printed - it will always get a number and be saved. Similarly, if the user tries to save the workbook without printing it, the invoice won't have a number and may be saved with the wrong name in the wrong folder. The Workbook_BeforeSave event is a good plac
e to put code to deal with naming and saving workbooks (see Code Listing Two).
*Unfortunately, there are some nasty gotchas waiting to catch you out. There's no way to set the name or path of a workbook without calling the SaveAs method, but if you call Workbook.SaveAs from inside the BeforeSave event, you immediately re-enter the BeforeSave event and can end up going round in circles. You need to leave yourself a marker to say that you've already been here and stop trying to interfere,
which is accomplished by using a Static variable; that is, a variable visible only within the procedure, and which keeps its value from one call of the procedure to the next. You could use a module level or global variable, but that would be using a wider scope than is necessary and good programming practice dictates you always use the minimum scope necessary for any variable, constant or procedure (because it reduces the chance that variable names will clash or that variables will get cha
nged when they shouldn't by some other procedure).
*So, in the Workbook_BeforeSave event, after you've determined that you're running in an actual invoice and not the template, check your booAlreadyWorking flag. If you're not already working, set the flag to say you are now, then you can get the next invoice number and call the ActiveWorkbook.SaveAs method to save the invoice with the correct name. This will immediately call the Workbook_BeforeSave event again and you're back at the top o
f the procedure. Check again that you're not in the template - this time you're already working, so skip over the middle bit and the procedure ends.
*Now execution resumes at the line after the call to ActiveWorkbook.SaveAs. You know the workbook is already saved, as you just called the SaveAs method and you've just been through the BeforeSave event. You also know Excel will try to save the workbook again when this instance of the BeforeSave event finishes and you don't want to save the wor
kbook twice, so set the Cancel flag to True. You can then cancel the flag that says you're already working and end the procedure.
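*Again, the real code is in Code Listing Two on the cover CD, but a stripped-down sketch of the BeforeSave dance described above might look like this - the invoice folder, the file naming and the .xlt test used to spot the template are all assumptions for illustration:
' Sketch only - folder, naming and template test are assumed, not prescribed.
Private Sub Workbook_BeforeSave(ByVal SaveAsUI As Boolean, Cancel As Boolean)
    Static booAlreadyWorking As Boolean
    ' Do nothing if it's the template itself being saved.
    If LCase$(Right$(Me.Name, 4)) = ".xlt" Then Exit Sub
    If Not booAlreadyWorking Then
        booAlreadyWorking = True
        If Me.Worksheets(1).Range("InvoiceNo").Text = "" Then GetInvoiceNumber
        ' This SaveAs fires BeforeSave again immediately; the re-entrant call
        ' sees booAlreadyWorking = True and drops straight through.
        Me.SaveAs "C:\Invoices\Invoice " & Me.Worksheets(1).Range("InvoiceNo").Value & ".xls"
        ' The workbook has just been saved, so cancel the save that triggered
        ' this event and clear the flag ready for next time.
        Cancel = True
        booAlreadyWorking = False
    End If
End Sub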
*That's the meat of the program code, but what about the design of the invoice? There are going to be formulae and descriptive text all over the invoice and you don't want your end user changing these, intentionally or by accident. The best way to ensure that is to mark only those cells the user is supposed to type into as 'unlocked' and then protect the whole
worksheet.
*To make it obvious that the user can or should enter data into a cell, I colour it pale yellow as well as unlocking it. You can click and drag to highlight a range of cells and then hold the Ctrl key while clicking and dragging to highlight another: keep doing this until you've selected all the areas in which the user may type. Press <Ctrl-1> to open the Format Cells dialog and from the Patterns tab, select the pale yellow colour. This will show up nicely on screen, but is too
pale to print on black and white printers.
*On the Protection tab, untick the Locked checkbox and then press OK. Now choose Tools | Protection | Protect Sheet..., tick the top box and, in the list below, tick only the box that says users can select unlocked cells. If you use a password, you may need to put it in the code if your macros change the values or formatting of any cells that are locked. See the bottom of the GetInvoiceNumber procedure in Code Listing One.
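*In other words, any macro that writes to a locked cell such as InvoiceNo has to lift the protection, make the change and put the protection back, along these lines (the password here is obviously a placeholder):
' Illustrative only - "letmein" stands in for whatever password you chose.
Const INVOICE_PASSWORD As String = "letmein"
Sub WriteInvoiceNumber(ByVal NextNo As Long)
    With ThisWorkbook.Worksheets(1)
        .Unprotect Password:=INVOICE_PASSWORD
        .Range("InvoiceNo").Value = NextNo
        .Protect Password:=INVOICE_PASSWORD
    End With
End Sub
*Alternatively, protecting the sheet with UserInterfaceOnly:=True from Workbook_Open lets your macros write to locked cells without unprotecting at all, although that setting isn't saved with the workbook and has to be reapplied each time the file is opened.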
*Protecting a sheet do
esn't stop the user from creating new sheets or deleting ones that are already there. For that, you have to protect the workbook structure by going to Tools | Protection | Protect workbook... 'Protect Workbook windows', the other option on the Protect Workbook dialog, is too drastic, as users should usually be allowed to move or resize windows.
*I'm sure the rules about Value Added Tax (VAT) could fill several library shelves, but the main ones are contained in The VAT Guide, Notice 700 fr
om Her Majesty's Customs and Excise. If you produce or receive invoices for a company, you should check section 6 about VAT invoices, where it defines what must be included on a VAT invoice. You have to, for instance, show the VAT rate or rates applicable, plus either the VAT charged for each line or a summary of the goods values and VAT charged for each VAT rate (or both). You must also show your company's VAT Registration number.
*Calculating the VAT for each line is a little difficult, as
you have to use the VLookup function to select the appropriate VAT rate from a code letter. You must also remember to round all calculations to avoid penny discrepancies in the totals at the bottom. Producing an analysis of the goods values and VAT amounts for each VAT rate is a little easier, as it only involves the SumIf function.
*The last word has to be on macro security. As this template contains macros, the code should be signed by the developer to avoid triggering security warnings
every time you create a new invoice. You can get round this requirement by putting the template into the user or workgroup templates folder and ticking the box in Excel's Security dialog Tools | Macro | Security..., that says to trust all installed add-ins and templates, but I can't recommend that course. Why should you trust something just because it's held in a particular folder? Sign your code, then your users know it comes from you and hasn't been tampered with.
**Template Wizard
Back in the original article in issue 30, I wrote about using the Template Wizard to create a database of invoice data. This would be a separate workbook containing invoice number, date, goods amount, VAT amount, total and so on. This Wizard, although useful, was dropped from Excel 2002 (Office XP) onward, but it's available as a download from <A HREF="http://office.microsoft.com/Downloads/2002/tmplwiz.aspx" target="_BLANK">http://office.microsoft.com/Downloads/2002/tmplwiz.aspx</A>.
*If you
use Office 2000 or before, the Wizard is on the installation CD, though you may have to run the installation routine again if you previously chose not to install Excel Add-Ins. You should find the sales invoice.xlt template, the nextinvoiceno.xls workbook and a copy of my original article on this month's cover CD.
Simon Jones revisits his past to shed light on a spreadsheet issue.
May 2003
103 (May 2003)
Applications
Paul Lynch
Webcam watchdog
This month, I'm taking a break from my usual handheld-oriented routine in order to discuss a project I recently carried out that has a lot of applications besides the security concerns that motivated me. Being a mobile computer user raises plenty of issues and only some of them relate directly to the problems of travelling.
*I'm coming to realise more and more that the support systems you need to leave in place at home are very important. For example, there are times when you have to be aw
Here's some harsh, cruel but worthwhile advice: if your company was hit by the recent 'Slammer' SQL Server worm and you're a board-level member of your company, I advise you to fire your IT director now. If you don't have an IT director but you're nominally in charge of IT, fire your IT manager now. Also, please identify who's responsible for the company firewalls and fire them as well. You might think this is a little harsh, maybe a little hasty. It's not. Those companies that got infected did so entirely because of their own carelessness.
*You may have noticed I didn't suggest firing the SQL Server administrators themselves, and there's a good reason for this - it wasn't their fault. And before I go any further, I should add that I'm spitting mad at the appallingly low quality of reporting, analysis and so-called expert opinion that has been expressed on the Web, in print and even on TV over this affair. Almost everyone involved should be hanging their heads in shame.
*To be sure, Microsoft can't escape entirely from this orgy of finger-pointing either. Yes, the firm issued a patch last summer for clearing up the problem, but only the most generous of persons would claim Microsoft is therefore blameless. The original patch was a bitch to install and, worse still, some subsequent patches appear to have undone the original one. That SQL Server isn't covered by Windows Update is something else Microsoft should ponder, because it ought to be easier than this
to patch core back-office technologies.
*Having said all this, it wasn't Microsoft's fault: a vulnerability was discovered, a patch was issued and database sysadmins decided not to apply that patch. The finger then logically swings toward those DB sysadmins, whom I've already said are innocent.
*Here's why. If you're running a line-of-business database for a large company, the last thing you do is fiddle with it. Applying patches and updates to such a box is done only in the most despera
te of circumstances, because the business risk is too great. Pop a Service Pack or patch onto a desktop PC, and if it doesn't work you have to recover one desktop PC. Do the same to one of the company file servers and the impact is greater, but it's still only one server. But apply a patch to a database server, which is typically a single point of failure, and everyone will suffer from its absence. Increased caution goes hand-in-hand with increased risk exposure.
*As for the worm itself, i
t got into SQL Server and the Microsoft Database Engine (MSDE) found in a number of desktop applications via port 1434, then propagated from that machine to other random machines. The rate of infection was staggering - in a few minutes, major Internet backbones were saturated with traffic. Analysis shows that these weren't line-of-business SQL Server boxes, but mostly desktop MSDE machines whose users were probably not even aware that they had MSDE installed.
*But remember, it had to get
a foot in the door to get into a computer. The vehicle of infection wasn't a dodgy email attachment, nor unpleasant hack code downloaded from a porn website. This virus walked up to the front door, opened it and walked right in, and the only reason it could do that was because port 1434 was open on the firewall to external traffic. Read that again carefully - port 1434 was open on the firewall for externally sourced traffic. Unless there was some exceptionally good reason for this port bei
ng left open - a reason that had been discussed, planned, documented and approved - the person running the firewall had, in my opinion, just signed their letter of resignation.
*There appears to be a widely held view - backed by Internet memes, rumours and accepted wisdom - that ports above 1024 are somehow not important and should be left open to incoming traffic. A similar view holds that every port for internally sourced connections (internal to external) should be open. This is appall
ingly wrong, culpably negligent and dangerous beyond belief.
**Are you 'ard enough?
*So now you get my view on firewalls, and how to harden machines against attacks, internal or external. None of it's difficult or complicated, and no magic or witchcraft is required. Follow these rules and you'll be safe and secure. If you want to break them, go ahead, but please have well-understood and documented reasons for doing so. Breaking them because they're inconvenient should be a sacking offence.
*Rule 1: every port, from 1 to 65535, for all internal IP addresses to every external IP address, shall be blocked to internally and externally sourced traffic. In your starting configuration, there should be no traffic from inside to outside nor outside to inside. Your firewall acts like a 2in air gap in the network cable.
*Rule 2: only open a port for a specified, clearly understood and properly documented reason. Therefore, open port 80 for internally sourced connections coming from
probably all machines and allow them to connect to all outside sites. Ditto for port 443 for HTTPS. If you route all internal web traffic through a proxy server, you needn't allow every desktop to make a port 80 connection to the outside world - only the proxy server does that.
*Rule 3: don't confuse client access with server access. For example, if there's no proxy server, a client will need access via port 80 to the outside world. The email server will require SMTP, both inbound and out
bound. I can't think of any good reason why a client would need SMTP access through the firewall, and ditto for DNS, SNTP, POP3, FTP and so forth. In a well-designed network, clients talk to servers and servers talk to the outside world. This applies for the Web, via an HTTP proxy, to email via the mail server, to SNTP time services via the main time server and so on.
*If you think about it, there are almost no circumstances in which a client should have any direct access at all to the ou
tside world for normal line-of-business operations. And this might seem obvious, but there should be no access whatsoever, under any circumstances, for external traffic to come directly into a client machine.
*Rule 4: VPN tunnels shouldn't be wide open. This is a classic screw-up, and people keep on getting it wrong. If you allow a VPN tunnel from a remote client machine, maybe a PC in someone's home, this is wide open, just as if that machine were on the local network. Do you have any co
ntrol over the state of this machine? What if it's virus infected? What if it's running some nasty spyware application? Do you really want to let this machine have wide-open access to everything inside your network? Of course not, but most people still set up VPN tunnels from home machines to be wide open and then are surprised when nasty things walk straight in.
*There's an even more worrying version of this, which I know for a fact hit companies in the City of London. Say you have a clos
e business relationship with another company and you want to have your database server talk to theirs, and vice versa. The classic solution is to set up a VPN tunnel between them - you trust them, they trust you. Now assume that their firewall design is somewhat shaky compared with yours, and they get infected with something nasty. Guess what? It's now on your network too. The solution is stupidly simple - don't leave wide-open connections to unverifiable sources. If this partner company need
s access to your mail server, open up port 25 to their network and leave everything else blocked. Apply a layered security model so that only the right machines get the appropriate access.
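*To make the default-deny idea concrete, here's a minimal sketch in Python of the sort of rule table I'm describing - it isn't any real firewall's configuration language, and the machine names in it (proxy-server, mail-server and so on) are purely illustrative:
# A minimal, hypothetical model of a default-deny policy - not a real product's syntax.
ALLOW_RULES = [
    # (source, destination, port, direction, documented reason)
    ("proxy-server", "any-external", 80,  "outbound", "web browsing via the proxy"),
    ("proxy-server", "any-external", 443, "outbound", "HTTPS via the proxy"),
    ("mail-server",  "any-external", 25,  "outbound", "outgoing SMTP"),
    ("any-external", "mail-server",  25,  "inbound",  "incoming SMTP"),
    ("partner-net",  "mail-server",  25,  "inbound",  "partner company mail only"),
]

def permitted(source, destination, port, direction):
    # Rule 1 in action: anything not explicitly listed above is dropped.
    return any(rule[:4] == (source, destination, port, direction) for rule in ALLOW_RULES)

# A desktop talking to the outside world on SQL Server's port 1434 is simply refused:
print(permitted("desktop-17", "any-external", 1434, "outbound"))   # False
print(permitted("proxy-server", "any-external", 80, "outbound"))   # True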
*There are some applications that place unreasonable demands on the firewall design. NetMeeting is one such, demanding that random ports be opened before it will work. If you need to have someone run NetMeeting to an external source, set up a VPN tunnel and shut it down whenever it's not required. Booking fire
wall access should be planned, just like booking a meeting room.
*These rules might sound paranoid, but they're not - those who followed these basics didn't have a single problem with this latest virus. Advice like 'well, maybe we should block port 1434 now' is staggeringly naive. Last week it was 1434, next week it might be 2096. Who knows? You certainly don't. The only secure policy is one that keeps out every single external TCP/IP packet unless it has a genuine, verifiable and accountable reason for being in there, and keeps every internal TCP/IP packet within your network unless it too has a gold-plated reason to be allowed outside.
*Yes, the rules are harsh and, yes, they'll break all sorts of half-baked and half-arsed IT business practices. That's a good thing. Some things in life have to be mandatory and good security is one of them. If your firewall and network people disagree, fire them on the spot. Have courage in your convictions and demand that security
policy is written down and fully documented, completely justified and can be explained to (and independently verified by) a non-technical person.
*The rules don't change for a complex multisite network either - in fact, the need for them becomes ever greater. A corporate client of mine runs these rules as I've described them here and discovered that a satellite office in another country had installed an ADSL line into their local network, which they refused to disconnect. There was only
one possible response from head office - they disconnected that remote office from the rest of the company network until the remote office abided by the security policy. This isn't a game, it isn't complicated and there are no excuses. Follow the rules - or go dig potatoes.
**Trustworthy Computing
*The SQL Slammer worm infected less than 100,000 systems, but it hit most of those in around ten minutes from the moment it started work. The number of Slammer-infected systems doubled every eig
ht and a half seconds, which meant that nobody had time to react. If you weren't already protected, you were infected before you could even start thinking about it properly (unless you simply disconnected from the Internet as soon as you discovered the worm was abroad).
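*To get a feel for just how little time an 8.5-second doubling period leaves you, here's a rough back-of-the-envelope sketch in Python - illustrative arithmetic only, since in reality the spread flattens out once bandwidth saturates and the pool of unpatched, reachable machines runs dry:
# Rough growth curve from one infected host with an 8.5-second doubling time.
DOUBLING_TIME = 8.5   # seconds

for elapsed in (60, 120, 180):   # one, two and three minutes in
    doublings = elapsed / DOUBLING_TIME
    print(f"{elapsed:>3}s: roughly 2^{doublings:.0f} = {2 ** doublings:,.0f} hosts")
Even one minute in, the numbers are already far beyond anything a human operator could respond to.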
*Could SQL Slammer have been prevented? Yes. Why wasn't it? There are a number of reasons, and Jon Honeyball has covered these above. Failure to apply the Microsoft patch was one, but let's be fair about this - in many case
s, companies had no idea they even needed to apply that patch. It affected MSDE (Microsoft SQL Server Desktop Engine) as well as SQL Server proper, and there were an awful lot of people out there who didn't even realise they had MSDE installed. Check this URL for a list of Microsoft applications that install MSDE: <A HREF="http://www.microsoft.com/technet/treeview/?url=/technet/security/msdeapps.asp" target="_BLANK">www.microsoft.com/technet/treeview/?url=/technet/security/msdeapps.asp</A>
*There were also many people who simply never knew the patches existed. Small companies running SQL Server, or an application with MSDE, might well have been unaware that they even had a problem. Windows Update is well known, of course, but it wouldn't have mattered if they were patched to the hilt - according to the Windows Update site - because it doesn't handle server applications. There should be one site that deals with updates, and it should check your system for all updates. For he
aven's sake, it doesn't take that long. Patch the world please, Microsoft, and get a move on too.
*I spoke with Stuart Okin, chief of security for Microsoft in the UK, and recent appointee to the Information Assurance Advisory Council (IAAC), a government-industry joint body dedicated to creating a safe and secure information society in the UK and abroad, about this update/installer issue, and about Microsoft's Trustworthy Computing initiative in general. He had this to say about security
matters:
*'We recognise that we need to continue to raise the security quality bar of our code. However, there will always be a need to patch software. One of the main focuses for Microsoft over the coming year will be to improve the patch management experience. We will be working hard to reduce the number of installers across our different products and improve the consistency of delivery and quality of documentation. In addition, we continue to improve our proactive and reactive process
es in dealing with security vulnerabilities, alerts and management.'
*Trustworthy Computing is to be built upon four ideals - Security, Privacy, Reliability, and Business Integrity - each of which breaks down further into individual components; namely, Design, Default, Deployment and Communications. Thus, you can examine Security by Design, by Default, by Deployment and so on, and can also examine Privacy and the rest under these same headings. Trustworthy Computing is supposed to be a two
-way street in that you get to make decisions on how much or little of your personal information should be made available. This falls under the Privacy heading, as does control over what people are able to do with your information should you choose to pass it on.
*Security means that systems must be better protected than they have been up until now, and that mustn't mean you need new systems - it means Microsoft has an aim to ensure that its software will be more secure out of the box tha
n it has been so far. Reliability is the term being applied to describe consistent delivery of safe and secure software that doesn't fail on you, and Business Integrity means that Microsoft will be, and I quote, 'clear, open, fair, respectful, and responsive to customers and the public.'
*In my opinion, that last requirement also means Microsoft has to be far more proactive in ensuring that patches are made available in a more timely manner. Communications means that the patches shouldn't
promptly break something else, and Reliability means that my systems should be as safe as they possibly can be even before patches are applied (remember that SQL Slammer was totally ineffective against systems with properly configured firewalls, even if the software hadn't been patched). Security ought to mean that I don't need to reboot my line-of-business servers every time I apply a patch, and I'd like that to fall under the Default heading please, though it fits equally under Design and Deployment.
*Business Integrity? Everybody has different opinions about this one. Do you trust Microsoft or not? Let's face it - if you use the firm's software, you already trust it. Microsoft has recognised the need for far superior security, far superior software, far superior deployment and more. Recognising the need is one thing, writing documents about it is another, but actually doing it is the thing that counts, and this is what will eventually decide Microsoft's fate in the worldwide softw
are market. The key is speed. Stuart Okin says Microsoft will be working hard over the coming year to try and reduce the number of installer technologies. That's great news, but while you're working on that please bring the checking technologies under one roof straight away.
*Create a new Microsoft Update and have it check everything on a system. Even if you get directed off to the different installer mechanisms that are currently in place, having the checks done in one visit will provide
a huge jump in the numbers of patched systems worldwide, if only because people will now find out about all the patches that are available.
*As I said, trust works both ways: you need to use Windows Update to check for patches to the OS; you have to go and visit Office Update; you need the Microsoft Baseline Security Analyser (MBSA) to check your servers; you need to regularly visit the Microsoft security website. If you don't do any of these, and therefore don't learn how to properly configure your firewalls, you are as culpable as anyone else for whatever new exploit comes along.
Trigger-happy Jon Honeyball polishes his pistol, while David Moss is feeling Trustworthy
May 2003
103 (May 2003)
Back Office
ay from home or the office, with no-one left behind in residence, and you want to have some way of keeping a check on the security of your premises. This obviously applies to both business trips and holidays, but can be equally useful for much shorter time intervals, even for routine overnight stays or daytime absences. Ordinary security alarms are fine, essential even, but any extra security helps (we can all call to mind cases of house and office alarms being ignored for hours by passers-by).
*The most obvious solution to this problem is via image capture, but this isn't easy to implement. A cheap webcam takes low-resolution snapshots pretty easily, but it's dumb, requiring an attached PC nearby to be left powered-on to run the camera software, which is too easily disrupted to be reliable. Also, a webcam leaves the captured images on a PC in the office - either the attached one or a file server - which are precisely two of the assets it should be trying to protect. What's m
ore, accessing these images remotely for peace of mind isn't easy unless you like poking holes in your firewall. My rule with firewalls is don't leave any holes through to the core system for any reason, even though reconfiguring the firewall to allow that would be a breeze (for me).
*It's certainly possible to find PC software to run with a webcam that intelligently captures images and loads them to an external web server, but that's still a less resilient solution than having a dedicated
intelligent camera (if only because I might switch off the PC by mistake). Given that such smart cameras are available for much less than most managers' sign-off limit, this isn't a hard choice to make. I can see no reason to tie up a PC with a dumb webcam any longer.
*I've been using an Axis 2120 Network Camera, chosen because it's smart, features motion detection (very useful) and was recommended by friends in the computer security business. Plus, it comes from a company with a decent
reputation for dedicated network servers (<A HREF="http://www.axis.com" target="_BLANK">www.axis.com</A>). The Axis 2120 features a network port, a serial modem port (if you need one), a power connector and an adjustable stand. You can get a weather-proof housing if required, but you can't, yet, get it with built-in wireless networking. However, since you'll have to run a cable for power you may as well run a network cable alongside.
*Internally, the 2120 runs a version of Linux and a min
i-web server, and with even a bare-bones installation you can instantly capture still images or motion JPEGs remotely via your web browser. However, you should be aware that the video works with Windows Internet Explorer, but not in a reasonable way with anything else I use (including the Mac version of IE).
*Getting to this first stage is pretty easy: unpack the camera and plug it in, then type in a command line on any PC or Unix machine to set the IP address for the camera (using its MA
C address, which is printed on a sticker on the bottom of the case). This involves creating an arp entry on your machine and associating the camera's MAC address with whatever IP address you want, so that when it boots up the camera can detect that it has been allocated an IP address and use it. For example: arp -s 10.0.1.245 00:40:8c:10:00:86 temp
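*As a quick sanity check after the arp step, it's worth confirming that something is actually answering on the camera's new address before going any further. A few lines of Python will do the job (the address is just the example one above):
import socket

CAMERA_IP = "10.0.1.245"   # the example address assigned via arp above

def camera_reachable(ip, port=80, timeout=5):
    # Returns True if something is accepting connections on the camera's web port.
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

print("camera up" if camera_reachable(CAMERA_IP) else "no answer yet - give it time to boot")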
*The camera also supports BOOTP and DHCP, which you can use if you prefer, but the arp technique is just as fast. There's also a rath
er more complicated way of setting it up using a null modem cable, but that is a lot more long-winded.
*As soon as this is set up, you can access the camera's web server and see whatever it's seeing. Just point a web browser at the camera's IP address and, if it shows an image, it's working and you can start adjusting the system. The same web page is used to configure the camera to get the best possible picture settings, to activate motion detection and to define some accessible location
to which to upload the pictures. One warning: the Axis cameras all ship with a default username of 'root' and a password of 'pass', so change them immediately!
*On the front page, you should see a link to the administration tools pages, which are password-protected. Before you do anything else, select the links System, then Users and reset the password for 'root'.
*The next step is to set the camera to only capture images when it detects motion in the target area. In most images, there'll
be a lot of irrelevant fluff and only a relatively small area that really matters. You could leave the camera sensing changes over the full image, but I find that triggers more captures than you want. As it is, even changes of light level can trigger a capture, and I was surprised to find that sunrise through a relatively small window was enough to set it off. You should expect that switching artificial lights on and off will also trigger it, while relatively rapid motion through the field
of view - such as someone walking quickly past a doorway - won't.
*Knowing this, you can plan how best to place the camera so that significant events are always captured. As with any system of this kind, you must test it after it has been set up to be confident it works properly - run it through a full 24-hour cycle at least once before leaving it to its own devices.
*I found that a long hallway was the central point for all movement in my house, and setting up a single camera pointing a
long the length of the hall was enough to detect any unauthorised entry, unless it was confined to a single room - it just isn't possible to move from room to room without triggering the camera several times. I had to restrict the view to avoid car traffic outside causing false alarms, but in the event I still wasn't careful enough and my testing wasn't thorough enough - letters dropping through the mailbox set off the alarm, for example.
**Poetry in Motion
*Configuring the motion detecti
on settings is easy. First, go to the Applications | Operation Selection menu and choose Alarm mode so that it will capture images only when triggered, then to the Scheduler menu to set the trigger on 'input rising in the default window'. For motion detection in a subwindow, go to the Applications | Motion Detection menu and use the Java applet to select up to three subframes from the image inside to check for motion. Be careful that you don't move the camera after doing this.
*I keep my c
amera firmly behind a NAT router, which in turn is behind a normal firewall, so there's no way that any external access can reach as far as the web server on the camera (nor would I want it to). Hence, I don't need to customise the internal web pages on the camera, but instead need to find an external server. The Axis supports ftp export of files so an ftp server is the obvious route, but I wanted to use Mac.com as I have an account there with enough space free for a reasonable number of i
mages. However, Mac.com only supports WebDAV, not ftp for file access. To work around this, I set up a fileshare on one of my servers, which mounted my Mac.com iDisk on the server, and set the Axis camera to connect to my server via ftp.
*The ftp location maps to my public folder on Mac.com, which I browse via my laptop when away from home to see any images captured. The filenames include a time stamp, so I have reasonable information about when events happened. Although it's called a 'pub
lic folder', I can set access rights on it to require a password to browse files. And if I wish, I can provide web access to the images via the Mac.com home page (although I can't do both of these at the same time).
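*If you'd rather not click around a web browser, the same timestamped filenames make it easy to pull back a list of the newest captures with a few lines of Python - the server name, login and folder here are placeholders, not my real setup:
from ftplib import FTP

# Placeholder details - substitute your own server, login and upload folder.
HOST, USER, PASSWORD, FOLDER = "ftp.example.net", "camera", "secret", "captures"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    ftp.cwd(FOLDER)
    names = ftp.nlst()

# Timestamped filenames sort chronologically, so the last few entries are the newest captures.
for name in sorted(names)[-5:]:
    print(name)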
*Setting up a simple security system using a smart network camera is easy, taking around an hour from opening the box, and there aren't many catches. I still get false triggers from the post arriving and from sunrise at certain times of year, but at least they reassure me that
the system is working. Extending it to permit door monitoring and other applications would be just as easy, and with the audio module that's available I can imagine sitting in a hotel room in Texas having a very confusing conversation with the milkman about why I can't pay his bill right now.
**Airport Extreme
*Last month, I mentioned that I believe the next move in wireless networking is going to be to upgrade from 802.11b to 802.11g, but I didn't have much evidence beyond a gut feeling.
As so often happens nowadays, real-world events have overtaken me, even before that column hit the street. Apple announced at the beginning of January that it would be updating the ever-faithful AirPort to support 802.11g and would ship the new 17in PowerBook with 802.11g support built in. This offers a vast speed increase at the cost of a minor reduction in range and, given that Apple has said it will be moving the aerial from the PowerBook's base into its lid, this range reduction shoul
dn't have much impact.
*In fact, it's likely that, for most applications, a PowerBook will now seem to have much better range. The AirPort Extreme base station, card and 17in PowerBook should all be shipping by the time you read this.
Paul Lynch examines the highs and lows of home security imaging
May 2003
103 (May 2003)
Mobile Computing
Brian Heywood
Audio on the hoof
When I was in Melbourne last year doing some recording, my reliable, though aged, Dell Latitude Portable PC got fried by a lightning strike. Since Dell wasn't prepared to support such an old machine, I had to get a replacement at pretty short notice and ended up with a Compaq Presario, happily 'on special' in the local computer emporium. While it doesn't have a particularly whizzy spec compared with today's state-of-the-art notebook PCs, it's far more powerful than my old Pentium/133 lapto
p and has a much bigger screen as well.
*I used my old portable mainly for office tasks like word processing and collecting my email, but it didn't have enough power to be considered for any kind of serious audio tasks, apart from MIDI sequencing and perhaps a bit of simple WAV file editing. This was as much due to its dearth of hard disk space (800MB) as to its relatively modest processing power. I did use it on a number of gigs to drive a Yamaha MU-10 MIDI module, basically to provide ba
cking tracks for an occasional electro-acoustic duo act I did with cellist Saskia Tomkins.
*However, MIDI is a far more compact way of representing music than a digital audio wave file, which uses up 10.5MB per stereo minute. Given that my steam-powered portable PC never had more than about 100MB of disk space going spare, this would have allowed a grand total of ten minutes of WAV recording capacity.
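*That 10.5MB figure is easy enough to check for yourself - it's just the CD-quality sampling parameters multiplied out, which this little Python sketch does:
# CD-quality stereo: 44,100 samples/sec, 2 bytes per sample, 2 channels.
SAMPLE_RATE, BYTES_PER_SAMPLE, CHANNELS = 44_100, 2, 2

bytes_per_minute = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * 60
print(f"{bytes_per_minute / 1_000_000:.2f}MB per stereo minute")          # 10.58MB - the 10.5MB above, in round figures
print(f"{100_000_000 / bytes_per_minute:.1f} minutes in 100MB of space")  # a shade under the ten minutes mentioned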
*Now, I have a powerful PC with over 3GB of available hard disk space, and that's a
fter all the music applications have been installed. You might think I've got the core of a nice little portable multitrack recording system, but sadly it isn't as simple as that.
**Expand the mind
*One of the great advantages of a normal desktop or tower PC is that it's easily expandable. Simply drop a specialised PCI interface card into one of its internal slots and you can convert your mundane business PC into anything you fancy. In my (or rather my PC's) case, I've rehabilitated a num
ber of older machines this way to perform useful tasks in the studio. However, as you'll know, when it comes to portable PCs this useful expandability isn't nearly as easily available due to the physical constraints of the packaging. This means you have far fewer options when it comes to using a portable PC for anything other than normal business applications.
*A fundamental problem is that the audio hardware fitted in most portables is usually pretty dire - typically a basic chipset integ
rated onto the motherboard. The connection options are also limited, because connector space is at a premium on the cramped rear panel of your average notebook PC. For instance, my Presario 1200 only has a microphone input and a headphone output. The upshot is that you'll need to buy some dedicated audio hardware add-ons if you want to approach anything near serious music-making on your notebook.
*Unlike with a desktop PC, you won't have access to the large number of reasonably priced PC
I audio interfaces that have become available over the years. It's a real shame because the PCI bus is ideal for high-performance audio, as it can handle very high data rates. These cards are just too large to fit inside the confined spaces of a notebook interior. All is not lost, though - there are options available that plug into PCMCIA/CardBus slots (nowadays called PC Card for added confusion), USB and FireWire ports. Both PC Card/CardBus and USB ports are pretty common on notebooks, w
ith FireWire increasing in popularity recently, especially on more expensive systems (and Apple iBooks).
*I'd rule out USB for audio, for the reasons I discussed in my March column last year. While the data throughput of the USB bus in its fast mode (12Mb/sec) is high enough to handle multichannel audio, both the latency and the fact that the audio device has to compete with other USB devices for bandwidth mean that it would be difficult to ensure good results using this type of interface
. 'Latency' in this context refers to the interval between the time some real-time event actually occurs - in this case, the sound you're trying to record - and when the computer detects that event. With audio data, delays much above 5ms are noticeable and can seriously affect the usability of the system.
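*To put that 5ms figure into perspective, the latency contributed by an audio buffer is simply its length divided by the sample rate. Here's a quick illustrative calculation in Python, using typical buffer sizes rather than measurements of any particular interface:
# Latency contributed by an audio buffer is its length divided by the sample rate.
SAMPLE_RATE = 44_100   # samples per second

for buffer_frames in (64, 128, 256, 1024):
    latency_ms = buffer_frames / SAMPLE_RATE * 1000
    print(f"{buffer_frames:>5}-frame buffer: {latency_ms:.1f}ms")
# A 256-frame buffer is already nudging past the 5ms threshold mentioned above,
# before any contention on the USB bus is taken into account.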
*Using a CardBus interface looks far more promising, since it has a much more direct route to the underlying PC hardware. CardBus is a 32-bit data bus, which is a development from the 16-
bit PCMCIA interface standard. PCMCIA stood for Personal Computer Memory Card International Association and was originally designed as a convenient way to add memory to portable computing devices. However, it didn't take long for developers to realise that it could be used for more general interfacing tasks. When the specification was upgraded to 32-bit, CardBus' designers took this interest into account and enabled it to handle data rates up to 132MB/sec and offer similar facilities to t
he standard PCI bus.
*The final option is the FireWire interface. The latest version can handle data transfer rates up to a maximum of 400Mb/sec while still retaining the ease of connection that you get with a USB system. The standard was originally developed by Apple, but has been taken up by other manufacturers such as Sony (which calls it i.LINK) for transferring digital video between systems, and it has been adopted by the IEEE.
*Once this interface becomes almost universal, there sho
uld be a big market in interfaces that will bring down the price of peripherals considerably. To take advantage of the higher data rates, your notebook PC needs to have FireWire integrated into the motherboard.
**Nuendo Multiset interface
*I've been using Steinberg's Audiolink 96 Multiset interface, designed to augment the firm's Nuendo integrated multimedia production environment, which has a CardBus interface. This hardware is basically a rebadged RME product and can be used with any so
ftware that works with either Windows or Steinberg's ASIO interface standards. In terms of physical connections, the unit provides eight analog audio inputs and outputs - balanced or unbalanced - along with an additional eight inputs/outputs via ADAT Optical Links, plus S/PDIF, word clock, ADAT synchronisation and MIDI-In and Out connectors.
*If you want to delve a little deeper, the Multiset is equipped with 24-bit converters, with sample rates up to 96kHz - it's bang up to date with reg
ard to audio quality. The whole package fits inside a small but robust box (45 x 215 x 120mm), which is essentially a half-width, 1U rack unit. It has a number of power supply options, with an external mains power supply plus cables for connecting the unit to a car cigarette lighter or 12V rechargeable battery. RME says that it thought of powering the unit via the CardBus interface but that its power requirements were beyond the interface specification. To be honest, this isn't the sort of
device that you'd operate in environments without some form of auxiliary power being available.
*The Multiset connects to the Nuendo CardBus interface via a high-speed serial cable, which turns out to be FireWire compatible. This means that if my portable gets fried by lightning again, I'll be able to plug the interface directly into the replacement PC, since FireWire should be fitted as standard by that time. It also raises the interesting prospect of equipping a studio PC with the PCI A
udiolink 96 interface so you could share this module between studio and location recording tasks.
**System configuration
*Multiset's eight line-level inputs and outputs are accessible via 1/4in jack plugs, which can be used in either balanced or unbalanced mode - its audio circuitry automatically detects the type of connection and compensates accordingly. Having only line-level inputs and outputs means you'll have to use the interface with a mixing desk of some kind, especially if you nee
d equalisation or level-control features.
*I could quite easily dedicate this whole column to the various options for doing this, but to be brief, unless you have a mixing desk designed for multitrack recording, the easiest option is to use either the direct channel outputs or the channel insert points to tap into the individual audio signals before they get mixed down to stereo.
*On the Mackie CR-1604 mixing desk I use for live work, I can use the insert points on the first eight channe
ls to do this. Obviously, the quality of the mixer will have a major impact on the quality of the recorded sound, but so long as the desk is electrically quiet, you should be able to get good results. The Mackie CR-1604 - for all its other shortcomings - gives a very 'clean' sound, which makes it a good choice for this kind of application. The Multiset also provides a good set of interface-level options on internal jumpers, which allow you to match its input and output signal levels to tha
t of the mixer.
**Software issues
*To be able to use this kind of hardware, your audio application needs to have two important features. The first is that it should be able to play back audio in some kind of multitrack mode. This is equivalent to the 'arrange' window common in sequencers like Cakewalk or Cubase and allows you to display and record the new audio alongside existing tracks. The other requirement is that the software has to be able to record multiple tracks simultaneously, whi
ch should be true of all but the most basic of audio packages.
*Although the Multiset is obviously packaged to augment Steinberg's Nuendo software, there's nothing to stop you using the device with any software that can recognise and use multiple audio inputs. As I don't have a copy of Nuendo, I've been using the Multiset with Cool Edit Pro in its multitrack mode, and I also had a brief excursion into using it with Cubase SX.
*Despite it being an older version of the software, I found tha
t Cool Edit Pro worked well, giving an intuitive recording session feel, simply by selecting the desired input channels and then putting them into record mode.
*Cubase SX was also simple to set up, but it obviously needs a more highly specified PC than I have, since it wouldn't reliably record four simultaneous tracks without losing data. This difference in performance may be related to the fact that Cool Edit was originally written for the PC, whereas Cubase has been ported from the Appl
e Mac. I plan to see how SAWStudio and Cakewalk's Sonar behave, since they're both dedicated PC packages.
**NoExcuses guides
*At last year's Mad About Music show at Birmingham's NEC, I picked up a couple of PC CD-ROMs that were promoted as a comprehensive guide for guitarists and bass players. Ever since I started writing for the computer press, I've been on the look out for effective multimedia publications that take true advantage of the technology, and these two CD-ROMs come pretty clo
se to it - just slip the disc into your CD-ROM drive and a useful little multimedia guitar handbook pops up.
*While these publications could never be described as being encyclopaedic in their scope, they do have a wealth of useful information, utilities and 'music minus one' jam-along features. For instance, there's a handy little transposing utility, plus tips and tricks about the care and feeding of your guitar or bass.
At £17.99 each, they seem like a bargain to me. See <A HREF="http
://www.noexcusesguides.com" target="_BLANK">www.noexcusesguides.com</A> for more information.
**Tips & tricks
*One useful trick when using a digital delay in a mix-down session is to select a delay time that's related to the tempo of the track that you're recording. This technique can add 'punch' to the track and prevents the delay from making the mix muddy, because it tends to reinforce the rhythmic elements rather than competing with them. While it's not a particularly onerous task to
calculate the required delay, I've just come across a little freeware utility called Delay Calculator on the Steinberg support website.
*Written by Ian Price, this useful little program lets you enter the tempo, the note type (crotchet, quaver, minim and so on) and then displays the time delay that this implies. You can even select a triplet time. The entire program has a file size of just 340KB and so won't take long to download, even if you're using a modem. Just pop it on your desktop
and fire it up whenever you need to work out a delay.
*You can download the application from the Steinberg FTP Server at <A HREF="ftp://ftp.steinberg.net/download/pc/dtc/DELAYC_1.EXE" target="_BLANK">ftp://ftp.steinberg.net/download/pc/dtc/DELAYC_1.EXE</A>
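*If you'd rather work the numbers out yourself, the arithmetic behind a calculator like this is simple: a crotchet at a given tempo lasts 60,000ms divided by the beats per minute, and the other note values are fractions or multiples of that. Here's an illustrative sketch in Python:
# Delay times in milliseconds for common note values at a given tempo.
NOTE_FRACTIONS = {"minim": 2.0, "crotchet": 1.0, "quaver": 0.5, "semiquaver": 0.25}

def delay_ms(bpm, note="crotchet", triplet=False):
    ms = 60_000 / bpm * NOTE_FRACTIONS[note]
    return ms * 2 / 3 if triplet else ms

for note in NOTE_FRACTIONS:
    print(f"{note:>10} at 120bpm: {delay_ms(120, note):.1f}ms")
print(f"triplet quaver at 120bpm: {delay_ms(120, 'quaver', triplet=True):.1f}ms")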
Today's portable PCs have more computing power than ever. Brian Heywood shows how you can use it
May 2003
103 (May 2003)
MULTIMEDIA:AUDIO
Steve Cassidy
eBay inferno
EBay is, I can exclusively reveal to PC Pro readers, the spawn of the devil. Say you work in a small company with a small network. Well, you're only about three mouse clicks away from an eBay listing that's crammed full of the choicest plums from the last generation of major corporate network equipment. Stuff that normally arrives on a pallet or, at the very least, with an engineer can be yours for the plucking at something like a tenth of the original retail cost.
In 1637, the French philosopher and mathematician René Descartes wrote in his Discourse on Method that 'I think, therefore I am'. Had he been alive today and writing a discourse on Internet marketing methodology, there's a good chance he'd change that to 'I don't think, therefore I spam'.
*Last month, while writing about spyware, I hinted that I'm not convinced much thought is given to demographics when the average new media marketer lets rip with whatever bulk mailer is the flavour of the
ions: great wobbling piles of Compaq ProLiant 3000 servers for
£400 each; an HP 4000M (a modular Ethernet switch able to mix and match fibre, copper, 10 and 100Mb, all blindingly fast) for £500; and Dell PowerEdge servers at the same price, whose original ticket value would have been £7,000 or more.
*Most products aimed at the 'small business' marketplace have always been nasty, underpowered things with features chopped off according to a marketing rationale hellbent on preserving sector d
istinctions. So we get small business servers with IDE disks, and it's extra if you're in a medium-sized business and 'deserve' a RAID controller. Of course, the larger enterprises need Storage Area Networks, and that's Ferrari money before you even get started.
*I've devoted many words in this column to telling people how I think this is dreadfully misconceived, because smaller businesses actually work their hardware far harder than big businesses ever do. In a large operation, it's one j
ob per server, whereas small companies seldom have the luxury of running three or four separate machines to move their email, serve their files and dish up their SQL data.
*For these reasons, I approve, in theory, of having access to recently retired 'big iron' equipment. Unless you're doing vast SQL runs or holding hundreds of thousands of word-processed documents online, you'd be hard pushed to tell the difference between a dual 700 and a dual 2.2GHz Xeon in a moderately busy network. An
yway, the hardware that pushes the data to you isn't the processor of the server; it's the RAID controller.
*A three- or five-year-old RAID card that cost
£1,500 new will outperform a modern IDE drive in a multi-user scenario, because that's what it was built to do. Better still, pairing up modern SCSI drives with these older controllers gives the performance even more of a boost. It will also allow you to sleep better at night instead of relying on disks that - in the case of one server
I recently decommissioned - have been running with their disk access light permanently on for the last two and a half years.
*What's so wrong with pillaging eBay then? Why am I not an unreserved fan? Because the process of making worthwhile use of a large last-generation server isn't as straightforward as it might seem. There are several parts of the equation that won't jump out at you from the auction pages, and even though the majority of these systems will come to you from specialist
remarketing and maintenance companies, it's surprising how many little necessities still fall through the cracks:
*Drive dilemmas While I did say you can usually upgrade the SCSI disks in an older server, a huge number of 'corporate' machines like the old Compaq ProLiant 3/5/6/7000 series only ever ran with two or three SCSI disks installed. The drive cages for these are model-specific, or at least range-specific. Only quite recently has Compaq gone over to a completely consistent size and
fitting for its drive cages, and Dell still hasn't got there, actually going backward in some ranges by becoming even more inconsistent. Yes, it's possible to take an old drive enclosure, rip the mechanism out of it and install a bigger disk, but if you want five or six 18GB disks, and the machine you buy for
£500 has only three drives, be prepared to spend up to £150 per enclosure to get what you need from the spares departments.
*Configuration conundrums Bigger servers deliver their hig
her performance and resistance to load-related slowdowns by using custom hardware that diverges from the de facto standard embodied in your server OS (Windows NT or 2000) setup CDs. Such divergences can be extreme and a royal pain to identify, which is why server vendors make your life easier by including setup CDs with their servers. After several years in a computer room, then decommissioning and bouncing around in the back of the 21st century equivalent of Old Man Steptoe's knacker-cart, th
e chances of that setup CD still being with the server are slim at best. Oddly enough, while there are active hardware spares markets for most leading brands, none of them make the effort to keep these setup CDs available, and they're seldom downloadable from vendor sites.
*Memory I lose track of this issue, because I'm so frequently immersed in it, and every so often when I come across a genuinely small network it jumps out to bite me again. Real servers tend to use registered or ECC (or
both) memory to guard against certain rather theoretical types of failure as well as to run faster than you'd expect in conjunction with all that custom hardware on the motherboard. So you can't just feed these dinosaurs the same kind of memory as you put in most of your workstations. In the case of my favourite fossil server, HP's Netserver LC3, if you put in non-HP DIMMs, even of the right specification, the BIOS spits out disapproving messages on startup about your warranty being in per
il. Of course, the machine still works, but it would be anathema to those contract-writing support types who buy and use these servers from new.
*Processor pains Incredible as it seems if you've ever run a dual-processor machine, few of these ex-corporate boxes come with their full possible complement of CPUs. Considering that multiprocessing is the main reason for distinguishing between these and jumped-up workstations, that's unbelievable but true. Lots of large corporate server purchase
rs are unfamiliar with the idea of multiprocessor computing (which maybe explains why they buy so many servers).
*But the upshot for a prospective second-hand buyer is that you have to go out and buy a processor for yourself more often than not, and some of these are in short supply. Worse still, Compaqs, HPs and several other brands have plug-in Voltage Regulator Modules, and if the second (and subsequent) CPUs aren't fitted, you're back to eBay or some other source for that missing part
. And prices for these parts - as for extra disks, if you go back to the 'official' channels - can be steep. A Xeon 700 with 2MB of cache, which on paper ought to be a museum piece by now, can sting you for over
£500 because it will go into and speed up so many of these 50kg monsters.
*Heat stress This sounds trivial, but it's not. The older servers consume so much power they expect to be fed from two or three mains outlets, as well as having all that heat conducted away, preferably by a h
ulking great air-conditioning unit. Some of the earliest rack-mount, slimline Compaq servers, for example, will shut down on heat-protection grounds if you run them with their lids off for more than about 20 minutes - the lid is an integral part of the cooling system.
*Passwords Yes, really. I've had plenty of plaintive emails from people who've bought big network switches in particular, asking me how to reset the passwords on the management interface. There's no guarantee that you can get
rid of a password on any particular piece of kit. Don't forget the cautionary tale of IBM ThinkPads that have BIOS passwords so uncrackable that those belonging to forgetful users become instant landfill - it isn't a myth, and isn't restricted to ThinkPads.
*If you can't get into the management and configuration console of a switch that offers such a thing, those ports that have been set to custom configurations won't be modifiable. They may not autonegotiate, or they may be teamed toget
her in some ancient implementation of a VLAN or fault-tolerant interconnect, and you won't be able to wrench them free of that mode by putting 40kg of network hardware in a meat freezer overnight.
*The final indignity with eBay is that auction prices often manage to creep right up to the actual value of a new replacement, but the auction experience can carry you away with the competitive enthusiasm of the chase. You need to factor in the growing presence of secondary channels too. In parti
cular, I have in mind the Dell Outlet. Sneak through the Dell website (making sure you start at the .co.uk version, not the US one and avoid the 'hot offer' graphics) until you can see the Dell Outlet button on the left-hand side. There's a not-so-slick list that lets you click on the machine specifications - but doesn't go any further - of servers at prices that overlap thoroughly with the top end of the traders on eBay.
*Looking at the situation from the other end of the telescope, I've
recently been working on a server room overhaul in the sort of environment where all that eBay 'stock' comes from. It's the kind of space in which you need to know not just what your server is called, but also its aisle number and cabinet ID just to get the keys to reach the back and turn off the power. Here, we wound up with four servers, even though we had only installed two in the dim and distant past, but we replaced even our initial two with just one: a full-size Compaq rack-mount 570
with twin 2GHz Xeons. The advantage of this replacement was much more than just the simple increase in speed: the temperature sensors in the rack showed that the new server ran several degrees cooler than the old ones could manage while supporting nearly four times the original planned throughput. So what's all this got to do with the Dell Outlet?
*Simply that for another project a few days later I was pricing up the same size of Compaq for maximum expandability and the same duration of
service contract we'd got used to, but yet another project reared its head and I happened to take a look around the Dell Outlet. There was a PowerEdge 2650 - dual Xeon 2.2, RAID controller and some disks - for
£1,800, which is only £300 more than the price of the RAID card alone fitted to those other Compaqs now hidden in aisle 13, racks 11 and 12, floor 6 of that other server room.
*Before someone ritually murders me for over-buying on one job and under-buying on the other, the Dell is a
much more limited beast than the Compaq: it's a 1U rack-mount device with fans worthy of NASA to cool the Xeons, and three drives is all you're going to get inside it. In a big LAN, it would have a limited range of roles, but in the smaller LAN it now lives in, it's almost the fastest machine on the roster.
*The moral of all this is that it's less and less sensible to follow the vendor's advice about what you ought to be buying for your designated 'business sector'. As hardware speeds ris
e and the recession fills up the marketplace with overstock and failed orders, it's interesting to see what these unconventional sources can offer you, but not without adequate research.
Steve Cassidy looks at the unwanted surprises that Internet auctions may bring
May 2003
103 (May 2003)
Networks
Tom Arah
Gamut warning
Computer-based design for print looks as though it should be simple, as all it involves is moving the colours you see on screen onto paper. But, as anyone who's tried producing commercial print knows, controlling colour is anything but simple - disappointment is the rule rather than the exception. Why is it so hard, and how can it be made easier?
*First, you need to think about exactly what colour is. Your perception of colour depends on the way your eye's photo-sensitive receptor cells -
day. Indeed, the people who try to sell these spamming tools use a scattergun approach themselves - promising 10 million email addresses, often more - which is a good indicator of how little these messengers understand their medium.
*Having a high(ish)-profile email address doesn't help, as I get picked up by the spambot trawlers due to mentions on various websites over which I have no control, and the idiot brigade find it amusing to use my names when signing up for their free porno tour
or registering for some forum or other. On an average day, once the US has woken up, I see a steady flow of 200 to 300 spams - that's around 2,000 per week. Being just an 'ordinary' Internet user doesn't hide you from the menace, though, as a friend who is careful about the addresses he uses and where he uses them tells me he gets around 100 spams a day.
*Junk email is becoming a serious threat to the way we work and play online, especially when so much of it's obscene, fraudulent or bot
h. While I'm fairly easy-going, I'm also the father of four kids aged from three to 15 of both sexes, and I do everything I can to protect them from seeing this stuff. The 'censorship is evil' brigade must be living in a parallel universe if they believe that a five-year-old boy or 13-year-old girl isn't harmed by explicit and sexually abusive material arriving unannounced in their mailboxes.
*I challenge anyone to admit they'd do nothing if sexually explicit brochures for hardcore mater
ial arrived through their letterbox, addressed to their kids, and the contents spilled out onto the floor as they picked them up. But that's exactly what's happening with spam right now - it's not targeted and uses that scattergun approach to drop into as many mailboxes as it can.
*This is no longer the same game played all those years ago by Canter and Siegel with their 'green card lottery spam' that started the whole thing rolling. No, the name of the game now is to make money by convinc
ing sleaze merchants and serious businesses alike that by sending out 10 million untargeted mails they'll see a return on their advertising spend if only 0.25 per cent respond. That's a compelling sales pitch and the deal is easy to close when they reveal that the 10 million people can be reached overnight in a single mail-out costing a fistful of dollars.
**Canning the spam
*We need to fight spam on two fronts. Those fronts are education and technology - the first being the most importan
t. Get the message across to users of the spamhaus services that such untargeted, random, bulk mailings are worthless, that they damage brand reputation, alienate recipients and thus discourage potential customers, and - perhaps worst of all - associate the brand with others who spam; namely, pornographers, scammers and con-artists.
*Once they realise exactly what they're becoming a part of, I suspect many will stop. Those new media marketers who understand the true value of demographics
and can target specific market sectors without violating the user's right to privacy will flourish. After all, the commercial Internet is no different from any other marketplace and requires advertising revenues in order to survive. The trick is to ensure that the message doesn't destroy the medium.
*This is where that second defensive front opens up - the use of technology to fight back. The answer to spam is obvious: adverts are no use if they're not read. Banner ad-blocking software ha
s been deployed to great effect, shaping how that particular advertising sector operates. Banners are becoming less intrusive, more user-friendly and less likely to be blocked - or at least the ones that get past my banner blockers are! As far as spam emails go, separating the meat from the messages isn't so easy.
*At a corporate level, if you have the budget, you can use a third-party spam-filtering service such as Brightmail (<A HREF="http://www.brightmail.com" target="_BLANK">www.brigh
tmail.com</A>). A few forward-thinking ISPs such as Nextra/Cix also use this service and make it available to their members. It certainly catches a good 90 per cent of the spam to my Cix email address, but regrettably that isn't the address I use most.
*However, for the average Joe looking for a way to filter spam at the POP3 server level (or even within their own mail client) - to avoid wasting hours reading and deleting the stuff manually, never mind the bandwidth cost of downloading the junk - such highly priced services aren't an option.
*There are some clever tricks you can play. For example, I know someone who asks close friends and associates to add specific keywords to their mail subject header. His mail server fast-tracks such mails directly to their client mailboxes as it knows they're not spam - everything else goes through the full spam-filtering process looking for common spamhaus strings in headers as well as body text. Quite cute, and it works well if you're d
isciplined enough, but such measures are overly strict for most people and the risk of losing important mails is always there (say, if one of your associates forgets to play ball on a stressful day).
**Beating the meat
*I've been examining various commercial and community approaches to spam eradication for the last few years, and none has ever proved effective enough to get me hooked. In fact, for most of this time, I've reverted to the simplest of solutions: JDTS - Just Delete The Stuff.
This works but is time-consuming and offers only a hollow and fleeting feeling of victory until your next polling of the mail server.
*I tried to get into SpamCop (<A HREF="http://www.spamcop.net" target="_BLANK">www.spamcop.net</A>) and have been a paid-up member for some time, but it doesn't deal with the spam physically at your end; it just helps you to complain to the right people - the sysadmin for the spamming mail server. However, more often than not, that's just some open relay b
eing exploited, or a fake return address. And even if you do get a good flame through to the host ISP, the spammers will be long gone, as they treat their accounts as disposable - used once for a bulk mailing and then thrown in the trash.
*Ever wondered why so much spam is selling spam services? Because increasingly that's where the money is being generated in the spam business - the sale of those 50 million addresses on CD-ROM for
£50 a pop. A spamhaus will send out 25 million email flye
rs, which is a pretty painless procedure these days, and if they get 100 paid responses that's not a bad day's work. They can afford to move around, just like those dodgy geezers selling watches out of suitcases on the high street. As soon as they see the authorities approaching, they shut the case, nip round the corner and start selling again.
**Rinse cycle
*I no longer bother with SpamCop. I know it provides a useful service by monitoring spamhaus headers, but my working day is long eno
ugh already without the time it takes to be a good online citizen. Instead, I've turned my attention to another proactive spam-busting method, which I mentioned briefly a couple of issues back (see issue 101). MailWasher certainly appears to be dampening down my spam dose, and I'm pleased to report that it's working rather well. If you want the best standalone solution that enables a Windows box to intelligently screen incoming mail from your POP3 server before it gets despatched to your m
ail client, look no further.
*MailWasher isn't a fire-and-forget application, as it requires a degree of involvement from you to control what mail is allowed off the POP3 server and into the client. However, I heartily believe that this is a good thing. It's the kind of long-trousered approach that appeals to me, because I get both the satisfaction of hands-on crushing of my junk mail and the knowledge that all information gleaned will become part of a powerful community weapon against th
e spammers.
*The principle is simple enough: you fire up MailWasher and it checks your account, or accounts on multiple servers simultaneously, for messages awaiting download. It then looks at their headers and determines from these and the body text whether the message contains a virus or matches spamming patterns, using a combination of identification processes.
*At its simplest, it applies a blacklist/whitelist approach. If a domain or specific address is found in the specified blackl
ist database - you can configure it to use default databases, specify your own or edit existing entries to suit - the message is flagged for automatic deletion upon hitting the Process button. Similarly, messages from domains or addresses on the whitelist - called the 'friends list' in MailWasher - are marked as friendly and automatically flagged for download into your mail client of choice.
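*To make the blacklist/whitelist principle concrete, here's a rough sketch in Python of how that kind of screening works in general terms. To be clear, this isn't MailWasher's own code, and the addresses and domains are invented - it just illustrates the underlying idea:

# Sketch of blacklist/whitelist screening on a message's From address.
# Not MailWasher's code - the domains and addresses are purely illustrative.
BLACKLIST = {"bulk-mailer.example", "spamhaus.example"}
FRIENDS = {"editor@pcpro.example", "family.example"}

def classify(sender):
    address = sender.lower()
    domain = address.split("@")[-1]
    if address in FRIENDS or domain in FRIENDS:
        return "download"          # whitelisted: flag for download into the mail client
    if address in BLACKLIST or domain in BLACKLIST:
        return "delete"            # blacklisted: flag for deletion on the server
    return "review"                # unknown: leave for the user to decide

print(classify("offers@bulk-mailer.example"))   # delete
print(classify("editor@pcpro.example"))         # download
print(classify("someone@nowhere.example"))      # review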
*This approach doesn't work at its best straight from the box, but rather takes time for you to bu
ild black and white lists that address your specific spam loads. In practice, I've found it needs ten to 14 days of input before it starts making the right decision more often than not. Once you've reached the stage where MailWasher is correctly blacklisting all your spam and letting all your confirmed friends through, you might want to think about switching to a more automatic mode. By choosing to just download and not display entries from your friends list, you save on screen estate and
are fairly sure that only acceptable mail is making it through, without further intervention.
*By the same token, you can just as easily set MailWasher so that blacklisted entries get deleted automatically. After three weeks of training here, day in, day out, I'm now confident that it's catching the majority of genuine spam using this method. Be sure you're happy that the software has got to know your way of working and what you classify as spam and what you don't before you start relying
on its automated features to get it right every time.
*Another nice real-world touch is the ability to apply standard mail client-style filtering to those searches (with regular expression support). Not everyone's cup of tea, I have to admit, but it does add another level of usability to this simple yet powerful application. What should be more up the PC Pro reader's street is a facility to use heuristic checking of body content and flag anything that matches the heuristic patterns for spam.
*The important thing is that MailWasher sucks mail information off the server, intelligently identifies it using some or all of the methods discussed above, and then starts to flag each mail item accordingly.
*If you're polling multiple accounts on multiple servers, this isn't an instantaneous process, especially at the start of the working day when you normally see your biggest spam load. However, neither is it a long, drawn-out job, and my logs over a one-month testing period show
that it took an average of 30 seconds to classify 100 items of mail once their headers had been hoovered off the server.
*The important thing, especially during this initial training phase, which takes as long as you want in order to be comfortable with how accurately items are getting categorised, is that you let MailWasher finish the flagging before you hit the Process Mail button. Do that too soon and you run the risk of losing important mail, and there's no getting it back either, bec
ause once it has been categorised as spam it will be deleted directly off the server never to be seen again.
*It's at this mail-processing stage that I feel MailWasher gets a little too clever for its own clogs - it offers one option too many. You can choose for mail to be downloaded (if non-spam) or deleted as already mentioned, but there's also a third option that sends a 'bounce' message to the sender of the spam in addition to deleting it from your server. 'Hey, that's a great idea' y
ou and many others might well be saying now, but no, believe me it isn't, it's a deeply flawed one. The main problem is that it isn't a real bounce message, and it only gets sent to who MailWasher thinks is the sender of the spam.
*This double whammy of arse-backwardness makes it no better than useless in my opinion. First, there's the fake message business: anyone who knows anything about bounce messages will realise that a genuine one will come from the same server that received the ori
ginating mail, and if you're running multiple mail accounts on separate servers it all goes wrong. More importantly, though, a real bounce message is sent back at the time of receipt by the server, not five minutes or an hour or more later when you're polling it with MailWasher. Quite apart from that, the fact is that many spammers don't care if their message has hit the target or a black hole - it's the mere sending of it that has earned them the money.
*Then there's the question of gett
ing the bounce message back to the proper sender anyway. Hands up if you've had email bounce messages arrive in your mailbox apparently from someone who didn't want to know about getting their breasts enlarged or participate in the chance to earn commission from some Nigerian nobleman looking to launder an unlikely sum of money in your bank account. Do you really want to be adding to the spam problem by sending this kind of stuff to innocent parties whose only crime is to have had their em
ail details used by a spammer?
*But there's no denying that by coupling MailWasher's stage-one filtering approach with your mail client's pre-filtering function (you're using a client that lets you see what's sitting on the server awaiting delivery before it's delivered, aren't you?) you'll see a drop in spam volumes in the region of 95 per cent. Yes, you heard that right. By simply devoting a little of my precious time to proactively doing something about the spam problem as it affects me
, I've seen the amount of junk that gets delivered to my mail client and into my mailboxes drop from 2,000 per week to just 100! Not yet perfect by any means, but certainly worth all my effort, and perhaps yours.
*The latest versions of MailWasher also support Hotmail accounts, which can now be cleared of spam too. You can try it free of charge (no spyware I can find, just a scrolling banner in the application asking you to consider registering to support the developer). If you like it an
d use it, do consider making a donation as asked by the developer. He's not greedy and says you can make any donation you like, but $20 (
£13) gets you unlimited support. Check it out for yourself at <A HREF="http://www.mailwasher.net" target="_BLANK">www.mailwasher.net</A>
Davey Winder scrubs up to take a look at spam-filtering software MailWasher
Red Hat produces what's probably the best-known Linux distribution. Although SuSE is arguably more popular in continental Europe, many companies in the UK - including ourselves - still use the Red Hat distribution, and it's by far the most popular among US corporate Linux users. To some extent, our use of Red Hat here at Wide Area (and at W A Communications, our US subsidiary) is a case of 'sticking with what we're used to', but a recent announcement by Red Hat is making us think that perh
called 'cones' - interpret the different wavelengths of light falling on them. The three types of cone are sensitive to three wavelength ranges within the visible spectrum between infrared and ultraviolet, and it's a particular combination of all three responses that produces the sensation of seeing a particular colour.
*The cones' sensitive ranges overlap, so it wouldn't be true to say that they correspond directly to our perception of red (R), green (G) and blue (B). But by mixing diffe
rent combinations of these three 'primaries', we can reproduce the largest possible proportion of the visible spectrum. Within this light-based RGB colour mixing model, absence of light produces the perception of black, while a mixture of all wavelengths is perceived as white (as Newton showed with his prism). Due to the way the primaries are combined, the RGB system is called 'additive'.
*Computer monitors emit light, so the RGB mixing model is tailor-made for them (as it is for desktop s
canners and digital cameras). By varying the brightness of red, green and blue phosphors, the monitor produces a huge range, or 'gamut', of colours. It's convenient to store each primary's contribution as a single byte, permitting 256 (2^8) possible levels for each of the RGB components, which combined together can describe over 16 million possible colours (256 x 256 x 256), ranging from black (R0, G0, B0) through to white (R255, G255, B255). This beautifully efficient 24-bit RGB system is
what we're all familiar with, as represented by the computer's ubiquitous RGB colour mixer panel.
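*If you want to see the arithmetic for yourself, a couple of lines of Python cover it (nothing here is specific to any particular application):

# The sums behind 24-bit RGB: one byte per primary.
levels_per_channel = 2 ** 8                 # 256 levels each for red, green and blue
print(levels_per_channel ** 3)              # 16777216 - the 'over 16 million' colours

r, g, b = 255, 255, 255                     # white
print(hex((r << 16) | (g << 8) | b))        # 0xffffff - the familiar colour-mixer notation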
*Now all we need to do is get these RGB values onto paper, but this is where we hit the fundamental problem. Printed pages don't emit light like a computer screen. On the contrary, their inks absorb light so that a red pigment produces its colour effect by removing all the other wavelengths from the white light reflected from the page. This means ink-based colour mixing is 'subtractive' rathe
r than additive. A mixture of all colours tends toward black, while the absence of colours leaves the white of the paper - the opposite of RGB. However, we can still invent an equivalent ink-based three-colour mixing system with the widest possible range, choosing the 'complement', or opposite, of each RGB primary.
*The ink colours that fill these roles are Cyan (C), which absorbs red light and reflects blue and green; Magenta (M), which absorbs green and reflects blue and red; and Yellow
(Y), which absorbs blue and reflects red and green. The inks are translucent, which means that overprinting two of these subtractive primaries is equivalent to an additive primary; for example, overlaying cyan on yellow leaves a green reflected light. In this way, combinations of the subtractive primaries CMY can be used to produce a printed equivalent of the RGB gamut.
*Unfortunately, the inks are not perfect absorbers or reflectors and so combining maximum levels of each doesn't produce
a pure black, but rather a muddy brown. The solution is to introduce an extra, dedicated black ink (indicated as K rather than B to avoid the clash with Blue). In addition to improving colour reproduction, this enables high-resolution unscreened black type to be output with maximum contrast and hence maximum readability.
*The resulting CMYK colour mixing model - called 'four-colour process' by printers - is the basis of full-colour commercial printing. Its advantage over specially concoc
ted 'spot colour' inks, such as the Pantone range, is that it can specify a huge gamut of colours on the same page. Also, by breaking images down into halftone spots composed of variable quantities of cyan, magenta, yellow and black ink (see issue 79), it can fool your eye into seeing the continuous tones essential for reproducing colour photographs. In other words, you can produce any of those full-colour, photograph-filled layouts that you see in newsstand magazines by decomposing your p
ages into just four 'colour separations', one for each of the four CMYK inks.
*CMYK is a formidably flexible system and its proper handling is the secret of successful commercial print, which is why professional DTP applications and print-oriented vector drawing programs such as InDesign and Illustrator encourage you to specify colours in CMYK rather than RGB. Incidentally, since commercial print predates computers, CMYK colours are usually specified as percentages rather than byte values,
so white is 0C 0M 0Y 0K and a solid green is 100C 0M 100Y 0K.
*But we have another problem: the main advantage of CMYK print is its ability to reproduce the continuous tones necessary for colour photographs. However, as I've said above, photographic colours are actually stored as RGB values. Clearly, there's a need to translate between the two colour spaces, and as they're closely linked - with black and white inverted and each pair of subtractive primaries creating an additive primary a
nd vice versa - such a mapping should be mathematically fairly simple.
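*By way of illustration, here's what the naive, textbook version of that mapping looks like as a short Python sketch. Be warned that this is emphatically not what Photoshop or your bureau does - as we'll see, real conversions involve press profiles and tunable black generation - but it shows how closely the two models mirror each other:

# Naive RGB-to-CMYK conversion - an illustration only, ignoring press profiles.
def rgb_to_cmyk(r, g, b):
    # r, g and b are 0-255 byte values; the result is CMYK percentages.
    c = 1 - r / 255
    m = 1 - g / 255
    y = 1 - b / 255
    k = min(c, m, y)                  # the simplest possible black generation
    if k == 1:                        # pure black: avoid dividing by zero below
        return (0, 0, 0, 100)
    scale = 1 - k
    return tuple(round(100 * v) for v in ((c - k) / scale, (m - k) / scale, (y - k) / scale, k))

print(rgb_to_cmyk(255, 255, 255))     # white      -> (0, 0, 0, 0)
print(rgb_to_cmyk(0, 128, 0))         # mid green  -> (100, 0, 100, 50)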
*In practical terms, it's easy: in Photoshop, just change the Image | Mode command from RGB to CMYK and all the colour values in your image are automatically translated from 24-bit RGB to 32-bit CMYK. You then need to save your image in a format that supports CMYK - TIFF being the most common - for import into your DTP application.
**Gamut over
*It looks as if it could hardly be made easier - but hang on. Depending o
n the image you selected, you'll probably notice that some of the colours in your new CMYK version look different, sometimes very different. Convert an RGB image of Photoshop's rainbow gradient, for example, and the CMYK version is almost unrecognisable, all its bright saturated colours rendered dull and murky. This is an extreme case of the disappointment every designer will have experienced when seeing a four-colour process print-run turn out a dull travesty of the original on-screen lay
out.
*The typical response to this colour shift is to blame the printers, but the effect is unavoidable for a simple reason: the CMYK colour space is smaller than the RGB colour space - hardly surprising, given that CMYK produces its gamut using reflected light and imperfect inks. Those RGB colours that are unprintable - that is, 'out of gamut' - are the purest and most saturated, such as those in the rainbow gradient. Fortunately, such saturated colours are comparatively rare in nature,
but you can see just how much of the RGB gamut is unprintable using Photoshop's RGB colour mixer, which indicates unprintable colours with an exclamation mark - move the three red, green and blue sliders and you'll find whole swathes of the RGB colour space that are simply out of bounds for process printing.
*There's also another major problem. We've been acting as if there was one fixed CMYK gamut, as if the same percentages of ink will always produce the same colour, but that's simply n
ot the case. The CMYK gamut varies depending on the press and printing conditions, and the translation from RGB needs to take this into account. The most significant factors are the inks and paper used. Different presses in different countries tend to use slightly different setups and inks, and combine them slightly differently to produce different gamuts. Then there's the paper and how it interacts with the inks. Clearly, the same inks printed on uncoated newsprint and on bright white coa
ted paper will produce very different gamuts. And since the perceived strength of each primary is determined by the size of its halftone dot, the absorbency of the paper matters a lot.
*Another complicating factor is the addition of a black plate to the main CMY primaries, which, in an ideal world, would themselves combine to produce a true black. One of the advantages of this black plate is that equal percentages of cyan, magenta and yellow can be shifted onto it, cutting down on the pos
sibility of over-inking and giving subtler control over the shadows and tones in an image. In 'undercolour removal' (UCR), black ink is used to replace equal amounts of CMY only in neutral grey areas of the image, while in 'grey component replacement' (GCR) they're replaced even in coloured areas. Another advantage is that you can produce richer blacks, for example, by translating R0 G0 B0 to produce a 'warm black' with added magenta and yellow or, for a polar image, a 'cool black' with ex
tra cyan.
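*A much simplified sketch of the grey component replacement idea might look like the following - the strength figure is invented for illustration, whereas real GCR curves are non-linear and tuned to the individual press, as the Photoshop dialog described below makes clear:

# Simplified grey component replacement (GCR) - illustrative numbers only.
def apply_gcr(c, m, y, strength=0.5):
    # c, m and y are ink percentages; 'strength' is how much of the shared grey moves to K.
    grey = min(c, m, y)               # the neutral component common to all three inks
    k = grey * strength               # shift a proportion of it onto the black plate
    return (c - k, m - k, y - k, k)

print(apply_gcr(80, 60, 70))          # -> (50.0, 30.0, 40.0, 30.0): same colour balance, less total ink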
*In short, there are many, many ways of mapping the same RGB values to CMYK percentages, and producing the best possible results means customising the conversion across the full colour spectrum, taking into account not just the individual press, inks and paper but even the individual image!
*Not surprisingly, it's Photoshop that leads the way here. In version 7, you can achieve this level of control using the Edit | Colour Settings | Custom CMYK... command available from the CMY
K working space drop-down, whose dialog lets you specify ink colours and set GCR, UCR or custom black generation. As you make changes, you can see how the neutral RGB colours from 0 to 255 will be translated to each of the cyan, magenta, yellow and black plates as a graph (notice how each ink's graph is non-linear and different to the others). Once the profile is created and selected, it will determine the final CMYK values when you change image mode.
*The power's all there, and it's usefu
l to see how a CMYK profile controls the conversion process. But don't panic - no-one expects you to use it. Requiring a designer to become a pre-press expert would be crazy, and fine-tuning each individual image's RGB-to-CMYK conversion isn't practical. Thankfully, these days Photoshop, Illustrator and InDesign come with preset profiles for each of the main printing regions - Europe, the US and Japan - with alternatives supplied for coated and uncoated paper and - in the US - for web and
sheet-fed litho presses. Check with your bureau to see which preset it recommends for your job, or whether it has produced its own profiles (some bureaux prefer you to convert to a well-defined proofing colour space that they can set their presses to match).
*One final factor you need to know about and be in control of is how your out-of-gamut colours are handled. This is managed via the Colour Settings dialog's Rendering Intent setting (click on Advanced to see this option). If you're pro
ducing a flat colour logo, for example, and don't mind out-of-gamut colours simply being mapped to the nearest printable colour, you can select Absolute Colorimetric (though it would make more sense to select only colours that work both on screen and paper). If you're producing a business graphic, you might be less concerned with colour accuracy than impact, in which case select the Saturation option. For continuous-tone images like photographs, it's often better to sacrifice exact colour
matching to preserve the full tonal range, in which case choose Perceptual. And finally, for a cross between colour accuracy and maintaining tonal range, you can choose the default Relative Colorimetric option.
**RGB or not RGB
*That's how RGB to CMYK conversion works and how to set it up for good results. You should now see that achieving perfect results is impossible, and even achieving the best possible result is often impractical. And knowing what's involved, you might well prefer to
leave it to your printers, who are the press experts after all. If you have a good relationship, that might be the best option. But be careful: many large printers will simply apply a standard conversion, while others will refuse to accept a job with unconverted images as they know they'll be blamed for the inevitable colour shifts. If you're really unlucky, the printers will simply output the pages and photos as delivered and - depending on your DTP package - your RGB images could end up
as greyscales on the black plate!
*If you do take responsibility for conversion yourself, when should you do it? Some designers argue that if an image is going to be separated for commercial print, you may as well work in CMYK as early as possible. This does bring several advantages: colour correcting in CMYK is much more intuitive than managing RGB, and with a dedicated black plate it's easier to control shadow detail, for example, or to apply sharpening just to the black channel. If you
need to colour match exactly, say to add a tinted box that matches your corporate colours, or if your image was professionally scanned as CMYK, it definitely makes sense to work with CMYK.
*Generally, though, the case for editing in RGB is more convincing. For starters, the RGB gamut is much wider than CMYK and it's always better to retain as much colour information as possible. This is especially the case nowadays when you might well be producing work for screen/web use as well as for com
mercial print. You also get more editing power in RGB mode, because many features such as filters aren't available in CMYK.
*Plus, if you're going to be printing to a non-PostScript colour inkjet, you'll get better results if you stick with RGB. This may seem counter-intuitive, as inkjets clearly build their colours using ink. However, they aren't limited to the standard CMYK process colours and are optimised to work with computers, so they're treated as RGB devices. On top of all these o
ther advantages, 24-bit RGB is 25 per cent more efficient in terms of both space and processing time than 32-bit CMYK.
*The real clincher is that if you're using Photoshop, you can get most of the benefits of CMYK while still working in RGB. Photoshop's Colour palette marks those RGB colours that aren't printable with an exclamation mark, and clicking on that mark automatically converts the colour to the nearest CMYK-friendly colour. The Info palette shows you both RGB values and CMYK perc
entages and highlights problem colours as you move the cursor over the image. You can even work directly with an image's brightness channel by temporarily converting to LAB mode, without the perceptible and irreversible colour shift inherent in converting to CMYK.
*More useful still is Photoshop's soft proofing capability. Using the View | Proof Colors command (shortcut <Ctrl-Y>), you can preview how your image would print using the current CMYK profile, and the New Window command allows
you to compare RGB and CMYK versions side-by-side. Since this preview is non-destructive, you can swap profiles using the Proof Setup | Custom command to see how your image would turn out on, say, coated or uncoated stock.
*Best of all, you're able to see all areas of the image that are unprintable using the View | Gamut Warning command (shortcut <Shift-Ctrl-Y>), which indicates out-of-gamut colours using a grey overlay. The ideal is to keep these to a minimum when colour correcting image
s destined for commercial print (you can toggle the gamut warning when the Adjust dialogs are open). You may also use colour correction to bring these problem areas back into the fold - usually by lightening or desaturating - and Photoshop conveniently lets you select all out-of-gamut colours automatically from the Select | Colour Range dialog's drop-down.
*This gives you the best of both worlds - editing in the RGB gamut, but always taking into account and controlling your image's CMYK ga
mut, ready for the final conversion, separation and printing. With a colour-managed, profile-based workflow, you can push the conversion even further downstream, but that's another story for a future article.
Tom Arah explores the RGB and CMYK colour spaces, and how best to get from one to the other
*The announcement, as quoted on the company's website, says: 'Beginning with the 8.0 release, Red Hat will provide errata maintenance for at least 12 months from the date of initial release. At certain times, Red Hat may extend errata maintenance for certain popular releases of the operating system.'
*Twelve months. One year. That means that Red Hat Linux 8, which was released just before the end of last year, will officially be obsolete, as far as Red Hat i
s concerned, on 31 December 2003. The same goes for versions 7.1 to 7.3, while Red Hat 7 and 6.2 will already have been 'End Of Lifed' by the time you read this - the official last support date is 31 March 2003.
*Now, in principle, we agree that people should keep their systems reasonably current, but allowing just one year for bug fixes and security updates on something as important as an operating system is plain wrong in our view. Many organisations, after they've installed an OS, don't
want to go through the pain of a major upgrade unless they really need to. That's the case whether the OS in question is residing on desktop machines or - more significantly - on servers, where any downtime at all becomes a serious problem.
*Of course, all manufacturers have to put a limit on the amount of support they're willing to provide, as no-one can afford to keep supporting software that's ten years old once several later versions have been released. But to drop support for somethi
ng only a year after it was introduced is a bit too eager for comfort. This 12-month lifespan, we should point out, only applies to the 'standard' release of Red Hat Linux. If you fork out for the Advanced Server version, you get up to five years of 'errata maintenance', which is fine, but requires the purchase of an annual subscription ranging from $799-$2,499 (
£1,560) per copy.
*We completely understand that Red Hat survives on the revenue provided by support contracts and the like
, but we still have serious problems with it basically telling us that we have to perform a major upgrade at least once a year. Contrast that with Microsoft's approach: it's still supporting Windows NT and has only recently killed off MS-DOS. It's not often Microsoft does something we think the Linux community should emulate, but this is one of those times. And in the meantime, we may well be looking at a different, longer-lived distribution the next time we have to upgrade our Linux servers.
**MySQL grows up
*At the time of writing, Ian is creating a new training course for Learning Tree International. Entitled 'MySQL: Database Administration and Web Development', it will cover installing and configuring MySQL, and using PHP to create database-enabled web apps. We've never examined MySQL in much depth in these pages, so with the final release of version 4 due by the time you read this, we thought now would be a good time to talk about what's available in the new version a
nd what's due in upcoming releases.
*There's always a major debate when Unix folk talk about which relational database management system (RDBMS) should be used for a particular project. Like too many other Unix-related subjects, this topic seems to bring out the zealot in people. Oracle users will tell you that theirs is the only sensible choice, while DB2 users will argue that, on the contrary, DB2 is the only way to go; and both will agree, at the very least, that you should never consid
er using an open-source solution.
*We (of course) disagree, as we've been using MySQL for years and have found it to be fast, reliable and capable of doing virtually everything we need. Are you spitting out your espresso and reaching for the keyboard to email us a vitriolic response yet? Well, hold your horses. Give us the benefit of the doubt for a few minutes and we'll try to explain why we think MySQL is so good for many web-based apps and, with the release of version 4 (and upcoming 4.
1 and 5), why we think a lot of companies will start to treat it more seriously in the future for other apps too.
*Let's get one thing straight: we're not suggesting all the world's financial centres should immediately switch from whatever they're currently using to MySQL. However, for specific types of application, MySQL is probably better - and faster - than almost anything else out there. It excels in those situations where there are many read and few write requests to the database tabl
es, which pretty much defines the majority of web apps.
*Yes, systems often do have to write to tables, but the vast majority of accesses are to read values, and in that sort of case, especially where the actual SQL queries are not too complex, MySQL is blazingly fast. Indeed, some benchmark tests suggest it's faster than almost every commercial RDBMS out there, including the heavyweights. Not that you'll see such benchmark figures on the MySQL website (<A HREF="http://www.mysql.com" targ
et="_BLANK">www.mysql.com</A>), as restrictions in the licence agreements for many commercial RDBMS prohibit posting benchmark results without the manufacturers' consent.
*Another big advantage of MySQL is that it's free for most people. In the majority of cases, you can use it under the GNU General Public Licence, which allows you to implement it wherever you want. However, there are some situations (such as if you plan to use it as an integral part of your own commercial application) where you h
ave to pay a licence fee. There are also support contracts available, which - along with training - is where MySQL makes its money. While we believe MySQL is easy to install, sufficiently stable and unlikely to need much in the way of support, it doesn't cost that much and helps the company pay for the development of future versions of the software, so you should at least consider taking a contract.
*So what's not to like about MySQL? It's true that if you're using the version of the progr
am that's considered the 'stable release' - as we write it's 3.23.55 - there are some limitations, although probably not as many as you might expect if you haven't looked at it recently. For example, a common belief is that MySQL doesn't support table locking or transactions, both of which are important for apps where data integrity is paramount. In fact, the system has supported both these features for some time now, and transaction processing in particular is well tested and stable.
*There are still some limitations, though, in terms of SQL compliance, probably the most significant one for many people being the lack of subselects. True, there are ways around this, especially if you're accessing MySQL via a language such as Perl, PHP or Java, but it's still a deal-breaker for some people. Also missing are views, triggers and stored procedures - again, all things that can be coded around, but which, in an ideal world, should be part of the RDBMS subsystem.
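*To give a flavour of what 'coding around' the missing subselect looks like, here's a hypothetical sketch in Python. The table and column names are invented, 'conn' is assumed to be a DB-API connection from whichever MySQL driver you favour, and the %s placeholder style is the MySQL drivers' convention:

# What we'd like to write (needs subselects, due in MySQL 4.1):
#   SELECT name FROM customers
#   WHERE id IN (SELECT customer_id FROM orders WHERE total > 100)

def big_spenders(conn, threshold=100):
    cur = conn.cursor()
    # Step 1: run the inner query on its own...
    cur.execute("SELECT DISTINCT customer_id FROM orders WHERE total > %s", (threshold,))
    ids = [row[0] for row in cur.fetchall()]
    if not ids:
        return []
    # Step 2: ...then feed its results into the outer query.
    placeholders = ", ".join(["%s"] * len(ids))
    cur.execute("SELECT name FROM customers WHERE id IN (%s)" % placeholders, tuple(ids))
    return [name for (name,) in cur.fetchall()]

# A single join achieves the same result and is usually the neater workaround:
#   SELECT DISTINCT c.name FROM customers c, orders o
#   WHERE c.id = o.customer_id AND o.total > 100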
*The good news i
s that version 4 of MySQL, which should be available in a stable form by the time you read this, provides the basis for most, if not all, of those features to be added in the future. Although there aren't many significant changes to SQL support in version 4 itself, the engine has been rewritten to make future additions easier. For example, version 4.1 - in alpha as we write - will support nested queries (subselects), while version 5 should feature stored procedures and maybe triggers and v
iews.
*If you haven't investigated MySQL before - either because you've heard that it's lacking in features or because you'd never heard of it - we highly recommend you check it out. And yes, we do know that PostgreSQL, another open-source RDBMS, supports many of the features that MySQL is lacking. Please don't flay us alive for preferring one over the other.
**I fink, therefore I am
*Finally, we'd like to draw the attention of Mac OS X users to two neat utilities that you might not have
come across, the first of which is an Apple-developed X11 server. The X Window system (X11 is the current version) is the de facto windowing system for Unix, on top of which you run a 'window manager' such as GNOME or KDE that creates the actual look and feel of the graphical interface the user sees.
*When Apple came out with OS X, despite the fact that it uses a Unix subsystem, it had no X server and therefore no X-based programs would run. Several people found workarounds, and at least
a couple of X servers were developed, most notably XDarwin. None of these had the full Mac look-and-feel, although a fork of XDarwin called OroborOSX (<A HREF="http://oroborosx.sourceforge.net" target="_BLANK">oroborosx.sourceforge.net</A>) provides a decent facsimile.
*A few weeks ago, Apple released its own X11 server called (with great imagination) X11 for Mac OS X. There are a couple of advantages to this particular server. First, it exploits OS X's Quartz graphics subsystem, so most n
ew Macs will be able to deliver full hardware acceleration to X11 programs. Its second advantage is that it integrates better with the standard OS X look and feel than any other Unix window manager. For example, you can easily copy and paste between OS X and X11 apps and minimise individual windows to the dock. So if you have a favourite X Window application (such as The GIMP or OpenOffice), you can now run it on your Mac and probably won't even realise that it's not a 'standard' application.
*The second utility we want to draw your attention to is Fink (<A HREF="http://fink.sourceforge.net" target="_BLANK">fink.sourceforge.net</A>). The Fink project aims to provide all of those lovely Unix utilities out there to Mac users, without the hassle of compiling them yourself. Once you've downloaded the Fink utility itself, you can choose from hundreds of Unix utilities and apps, which will be seamlessly downloaded and installed for you. You don't have to be a Unix expert, or unde
rstand the intricacies of compiling and installing software, as it has all been done for you.
*If, for example, you're interested in MySQL, which we talked about earlier, you can download and install it within five minutes (assuming you have a fast Internet connection). Some developers already provide compiled versions of their utilities that will run on OS X without any extra work, and some can be easily compiled by yourself, but if you don't want to get your hands dirty, Fink is a neat
and useful application.
*And for those of you who still turn up your noses at the Mac: have you actually looked at OS X recently? A few weeks ago, we bought a new PC laptop and installed Windows XP Professional on it. We then spent about half a day installing all the other stuff we wanted - utilities like dig, languages like Perl and so on. We ended up installing the Cygwin package, which provides many Unix utilities on the PC, but it wasn't exactly a fun experience. Mac OS X, on the other
hand, has all of this stuff available out of the box, because it's Unix to begin with.
*For many Unix developers and hackers, the iBook and PowerBook are rapidly becoming the laptops of choice. Check them out - we don't think you'll be disappointed. (But then, both of us have been using Macs since the days of the Mac Plus, so maybe we're just a little biased.)
Simon Brock and Ian Wrigley take Red Hat to task for no longer supporting several releases of Linux
May 2003X
103 (May 2003)i
Paul Ockenden and Mark NewtonD
Search me!
Probably the most common question that web agencies will be asked by their clients is, 'How do I make sure my site is at the top of the list in the search engines?' Most of us have our own little hints and tips about how to get onto the front page of AltaVista or near the top of the list on Google.
*I'll tell you about my favourite in a moment, but first let's take a look at how search engines rank sites. To start with, they have to find your site, and it's one of the great Internet myths
that search engines can 'seek out' new sites. Clients worry that their site under development will be discovered by the search engines, so we often have to plaster multiple levels of passwords on top of them to give security more suited to Fort Knox than a simple brand website. In reality, the only way a search engine will discover your site is either because the site is linked to or from another, already-indexed site, or because you've submitted the site yourself using the 'add my site'
facility (or one of the multiple submission services).
*Once a search engine knows about your site, it will index it. This involves sending one or more automated agents, or 'spiders', to work their way through the site, following the links found. You may hear this indexing process referred to as 'spidering'. It's worth noting that you can stop the search engines from indexing certain pages or sections of your site, which you may want to do if they have very limited lifetimes; for example,
a page of today's news. To exclude such pages, simply create a file called robots.txt and place it in the root directory of your site. The format of this file is as follows:
*User-agent: *
**Disallow: /scripts/
*Disallow: /news/
*Disallow: /backend/admin/
**The first line specifies which search engines the rules apply to: an asterisk is a wildcard, meaning that all search engines should obey the rules that follow - a list of directories that the engines should ignore. You can even exclude an individual page, as in: Disallow: /backend/admin/logon.php
**Add comments to your robots.txt file by preceding them with the # character: Disallow: /news/ # No point in indexing these, as they are changed every day.
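*If you want to check how a spider will interpret those rules before you upload the file, a few lines of Python make the logic plain. This is a deliberately minimal sketch - real crawlers (and Python's own urllib.robotparser module, for that matter) are more thorough - but the principle is simply prefix matching on the disallowed paths:

# Minimal sketch of how a spider applies the Disallow rules shown above.
DISALLOWED = ["/scripts/", "/news/", "/backend/admin/"]

def allowed(path):
    return not any(path.startswith(rule) for rule in DISALLOWED)

print(allowed("/news/today.html"))       # False - the spider should skip it
print(allowed("/products/index.html"))   # True - fine to index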
*Different search engines have different spidering rules. Some, like Google, trawl through the entire site, while others, like AltaVista, only follow links to a few levels. Either way, they're interested in text and won't take any notice of pictures, so if you've used rendered text for y
our headlines the search engines won't see it. What's more, you can't even rely on duplicating the text in the Alt tags of such images because, although Google and AltaVista will now see it, the increasingly prevalent Inktomi-based search engines won't.
*Meta tags are now pretty much dead and buried (see Mark Antony on meta tags), and the parts of your page the search engines are really interested in are its main content - the text - and its title. The title's particularly important, as some
engines now treat this as the main indicator of what's on the page. Another key to your ranking within the better search engines is the popularity of your site, by which I don't mean your total traffic or the size of your log files, but rather how many other sites link to yours. A good set of incoming links from sites that are themselves popular will push you way up the rankings.
*A useful site for keeping up with the various search engines and their current indexing and ranking rules is
<A HREF="http://searchenginewatch.com" target="_BLANK">searchenginewatch.com</A>. The articles are very good, although some of the more 'juicy' information is only available to paid subscribers.
*Given that you know the various rules these engines use to rank sites, it should be relatively easy to play tricks on them and cheat your way to the top of the listings. There's a whole industry of companies that will tinker with your site, link to it from loads of other fake sites, repeatedly sub
mit it, submit certain pages as well as the front page, and do all kinds of other things to get you to the top of the results list.
*Of course, the search engines' proprietors don't like this, as it makes their results less useful and annoys users - if someone searches for Henry VIII, they don't want a site trying to sell weight-loss tablets. As a result, the search engines actively seek those sites that try to cheat and apply weighting factors to ensure these sites sink down to the botto
m of the list. And so begins a game of 'cat and mouse', where the site-promotion companies find new ways to cheat, and the search engines discover those cheats and penalise the sites that use them. It's a never-ending cycle.
*Now seems like a good time to tell you about that cracking 'hot tip' I promised earlier, the one guaranteed to get you good search engine rankings. My hot tip is: don't use tips and tricks. Ultimately, you can't fool the search engines, and if you try to cheat your wa
y to the top they'll find you and penalise your site. The only reliable way to get near the top and stay there is to have a popular site with good, strong, relevant content, which is updated on a regular basis. All the search engines are trying to do is show the best sites at the top of their list, so if you want to be there build a 'best site'.
**Free web design
*Sometimes you need some inspiration for a website design, and just the other day Mark stumbled across the Open Source Web Desi
gn site (<A HREF="http://www.oswd.org" target="_BLANK">www.oswd.org</A>), which is full of sample designs. The best bit about these designs is that, as they're all open source, you can, if you want to, take one and use it for your own site. The site allows visitors to vote on the designs as well, which encourages improvement in the quality of the submissions. It's a great platform for very talented people - some of whom appear to still be at school - to show what they can do.
**Flash remoti
ng and web services
*Previously, we've covered how to build your own Web Service. This month, Mark has a go at doing something useful with one. For those of you who weren't paying attention last time, a Web Service is a component that sits on your web server and, when queried, returns data in XML form to your web application. Web Services are a key part of Microsoft's .NET technology, although they aren't limited to Microsoft operating systems. The original idea was for Web Services to sit
out on the Internet and be accessed by many other sites either on a paying or free basis.
*This use is seen as secondary by Microsoft, which now prefers to stress Web Services as a core technology for providing data streams to your own line-of-business applications and websites (perhaps also for business-to-business transactions with trusted partners). But the original vision put forward by Mr Gates of a 'great calendar in the sky' that many sites could access has been put on hold. Micro
soft says that having spoken to its customers, they don't want this, but we suspect it's more a problem of security and Denial-of-Service attacks that could render such services more of a liability than a benefit.
*Web Services certainly provide a very useful way of separating the business model from the design environment, but getting the XML data stream into something that looks good on your web page is somewhat harder. Macromedia's Flash has always been popular with designers for its i
ncredible flexibility and ease of animation, and successive versions have brought many enhancements to assist in building user interfaces. The latest version, Flash MX, can now be considered a UI design tool of choice for web applications. However, building a UI is all very well (and the sort of thing that can be left to your designers), but how do you get data in and out of Flash? There are several ways of doing this, from the early 'kludges' to the latest scheme called Flash Remoting, wh
ich we'll be investigating here.
*As with most things Macromedia nowadays, connecting client-side products with server-side stuff is a lot easier if the server side is Macromedia's own ColdFusion. Never being ones to take the easy way out though, we'll show you how to connect a Flash MX application to ASP .NET (but it could be any Web Service). Macromedia provides Flash Remoting as the 'glue' technology for this purpose, and it works by creating a gateway on the web server between the clie
nt-side Flash application and the server-side Web Service.
*Take a look at <A HREF="http://www.macromedia.com/desdev/mx/flashremoting" target="_BLANK">www.macromedia.com/desdev/mx/flashremoting</A>, and from there you can download a trial version of Flash Remoting. After installing this on your web server, take your Flash application and put the following code into the onClipEvent for the relevant frame: #include "NetServices.as"
*Make sure the file netservices.as is where you say it is,
otherwise when you run your Flash movie in development mode you won't see 'running netservices' in the output window. The next bit of code just makes sure this gateway isn't called more than once, which would consume more resources than necessary: if (inited == null) { inited = true;
*Now we need to connect to the Flash Remoting Gateway, which is installed on your web server. Ensure that a web server is configured on the port that the gateway is looking at, and that this port isn't blocke
d by any firewalls. To check this service is running, open the 'service browser' in Flash, which will show you the URL of any gateway services that are running: NetServices.setDefaultGatewayUrl ("http://your_server_IP:8500/flashservices/gateway");
*Next, create a connection object in the usual way: gateway_conn = NetServices.createGatewayConnection();
*The final bit of 'wiring up' is to get a reference to the Web Service, which is normally the WSDL (web service description language) file.
Pointing your browser at this file - either on your own web server or on another on the Internet somewhere - should get you a page of XML, which is a simple check that you're referencing the correct file: myService = gateway_conn.getService ("http://your_web_server_URL/webservice.asmx?WSDL", this);}
*If everything is set up correctly, you need to extract the data into your Flash application. In our example, we provide a button that, when clicked, will call the Web Service, which returns
an XML text stream containing 'Hello World' for the Flash application to display in a textbox. All simple stuff, but if you can get this working you have the tricky bit of configuration sorted - the rest is down to your imagination.
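*Incidentally, if you'd rather script that earlier 'point your browser at the WSDL file' check, a few lines of Python will do it. The URL is the made-up one from the listing above, so substitute your own, and this only confirms that the service is answering with well-formed XML - it says nothing about Flash Remoting itself:

# Scripted sanity check of the WSDL file referenced by getService() above.
from urllib.request import urlopen
from xml.dom.minidom import parseString

wsdl = urlopen("http://your_web_server_URL/webservice.asmx?WSDL").read()
parseString(wsdl)                 # raises an exception if the response isn't well-formed XML
print(b"HelloWorld" in wsdl)      # the method you intend to call should be mentioned in the WSDL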
*On the subject of getting things to work, Flash has a fairly good debugging system, with breakpoints and watches as well as variable browsers. This doesn't put it on a par with 'proper' development languages, but it's much better than most web-authoring appli
cations. One problem we encountered with Flash's variable browser while testing Web Services is that it doesn't seem to refresh when you want it to. However, help is at hand via the Trace command, which is rather like Alert in JavaScript, MsgBox in VB or Response.write in ASP. These write text or a variable's value to either a user dialog box or - in the case of Flash - to its output window. This is invaluable when testing any Web Service, as often the result of a small error is that the w
hole thing fails to work, but with no error messages generated you're left wondering what's going on. It's always a problem when debugging decoupled solutions like Web Services and, while it's getting easier, there's still some way to go - you'll find yourself resorting to the older and cruder tricks at times.
*So, back to the code. We need a function that can be called when a button is clicked: function go_Clicked () {
*All that's needed here is a call to the function in the Web Service w
e're referencing (called, rather unimaginatively, 'HelloWorld'), so we write: myService.HelloWorld(); }
*The next two functions handle the returned data from the Web Service - these functions have the same name as the called function in the Web Service, but with _Result and _Status appended to them. The first function, HelloWorld_Result, will be called if the Web Service returns any data, while the second, HelloWorld_Status, will be called if it returns an error. Both functions replace the contents of Text_Box - the first with the data returned from the Web Service (as plain text rather than XML), the second with the error description:
*function HelloWorld_Result(result) {
*Text_Box.text = result; }
*function HelloWorld_Status(error) {
*Text_Box.text = error.description; }
*stop();
*This example is simple, but extending it isn't difficult. We found the most difficult part was the initial configuration and getting everything talking to each other. If you want some Web Services to test with, but don't want to write your own, check out
<A HREF="http://www.xmethods.com" target="_BLANK">www.xmethods.com</A>, which has plenty to try. However, while we were writing this column, the site went down, causing much puzzlement when our code wouldn't work. It was some time before we realised that the target site was down and our code was fine. The lesson: never forget to test for the bleedin' obvious!
Paul Ockenden gets spotted by search engines, while Mark Newton discreetly flashes
May 2003X
103 (May 2003)i
Web Business
Jon HoneyballD
The Office
I said that I'd be returning to Office 11 and here, a little later than anticipated, we are. I ought to first point out that I've been running the first beta release, which is remarkably complete and capable, and in spite of a few obvious nasty gotchas it's possible to run it all day and every day on a production desktop without major problems.
**I started using it with the attitude that Office 11 was almost going to be an upgrade too far - too many Wizards, too much handholding, too muc
h gloss and not enough innovative thinking. However, I'll confess that I've now modified this position. I actively like working in 11 and have started to notice things missing when I go back to Office XP or 2000, which is always a sign that a new release has 'taken'.
**But - and this is a big but - I'm still finding it remarkably hard to get that rush of enthusiasm that used to accompany earlier releases of the Office product. It just seems like more of the same and, to be honest, I can d
o my day-to-day work more than adequately using Office 2000. I did try going right back to Office 97 on an older server on my network, but I soon found myself growling loudly at the raft of useful and important features I've come to rely upon over the years that were missing. Admittedly, if I were being nerdy and had the time to spare to code up some routines, most of those missing features could have been replicated using some Office Visual Basic.
**It's quite clear, at least to me, tha
t Office 11 needed to be a forward-looking product and not just the latest in the long sequence, and in that it almost succeeds - only later beta revisions might be able to shift me from this rather negative position. You see, if all you want is a solid word processor, spreadsheet and so forth, your existing Office licence is more than good enough.
**To really look forward, Office 11 would have to provide lots of new capability that we can leap upon and notice the advantages. Although
I still believe that it will eventually deliver on that score, the key new items remain tightly under wraps at Microsoft. For example, there's still little information on XDocs, the forthcoming XML-based forms and workflow engine. Once you can start to build workflow systems using XDocs, tying in to the more mainstream applications as required, this will be a big step forward. Hands up those who have ever tried to build such things using VBA.
**There's no doubt that the pen support is co
nsiderably better in Office 11 than in previous versions, which is a mixed blessing - it's probably good enough to be adequate for most people most of the time. On the other hand, I'm still far from convinced that the 'ink' data type is properly understood by developers and recognised as the primary way to move things forward. I know some companies are looking at Tablet PC and thinking about using ink in this way. For myself, I keep getting within a few inches of buying a Tablet then backi
ng away again. The problem is I still want both more and less than Tablet PC can currently deliver - more power, screen resolution and battery life; less weight, heat and physical volume.
**On the XML front, I think Microsoft is doing some great work. My fears about Microsoft's Office XML implementation have all come to nought. I thought it would fill up the XML schema with all sorts of private binary data streams, but nothing could be further from the truth. Go into Excel, set up a works
heet with formulae and so forth, and then export it to XML and everything will be laid bare for your inspection. Surprised? I was. It's even smaller and more efficient than the BIFF native Excel file format - a simple little three-cell worksheet weighs in at 13KB in Excel file format, but 4KB in XML.
**To be fair, the XML format is much simpler than the full structured storage solution of .XLS, and as a result you can't do those complex Excel things like having a graph embedded (although
I think this will change over time). Word from inside the Office team indicates that XML will be the primary file format from now on, and it shows in this implementation. Just look at this snippet of code (Code Listing One), which is the core definition of my three-cell, two-value, one-sum worksheet.
**This is a piece of cake to understand. There's a worksheet object called Sheet 1 and this has a row count of 6 and a column count of 2, which makes sense because my values are held in B4,
B5 and B6. You can easily see the way the formula is constructed too: Row4 Cell2, which is B4, contains a Number of value 123; Row5 Cell2, which is B5, contains a Number of value 456; and finally Row6 Cell2, which is B6, contains a Formula of type "=SUM(R[-2]C:R[-1]C)". It also contains a Number of value 579, which is the result of the formula - as you'd expect, the sheet stores the most recently calculated value alongside the formula itself. The XML that follows this is just basic stuff
to do with Sheet 2 and Sheet 3, both of which contain nothing. The document information like Author and Last Saved is all stored in a preamble at the top of the file.
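**As a rough sketch of what that openness means in practice - and this is an illustration rather than anything taken from the Office 11 beta itself - a handful of lines of script can pull the values and the formula back out of a worksheet saved in this sort of XML. The element and attribute names below follow the SpreadsheetML layout described above, but the snippet is hand-written rather than a verbatim export:
# Sketch only: a cut-down approximation of the three-cell worksheet described
# above, parsed with Python's standard library.
import xml.etree.ElementTree as ET

SS = "{urn:schemas-microsoft-com:office:spreadsheet}"

SHEET = """<Workbook xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">
  <Worksheet ss:Name="Sheet1">
    <Table>
      <Row><Cell><Data ss:Type="Number">123</Data></Cell></Row>
      <Row><Cell><Data ss:Type="Number">456</Data></Cell></Row>
      <Row><Cell ss:Formula="=SUM(R[-2]C:R[-1]C)"><Data ss:Type="Number">579</Data></Cell></Row>
    </Table>
  </Worksheet>
</Workbook>"""

root = ET.fromstring(SHEET)
for row_number, row in enumerate(root.iter("Row"), start=1):
    for cell in row.iter("Cell"):
        value = cell.find("Data").text
        formula = cell.get(SS + "Formula")
        print(row_number, value, formula or "(no formula)")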
**Is it possible to imagine anything more simple or clear cut than this format? You can load this XML file straight into Internet Explorer if you want, which will intelligently lay out the XML for you, or you can load it into any other XML tool you like and it will just work. There's no doubt now that Microsoft will move to
an all-XML future for Office documents, and Office 11 is the starting point. The XML stuff in Office XP can now be seen for what it always was - an irrelevant distraction - and this is the real version to be working with.
**Office 11 on Windows 9x
*I doubt it will come as any surprise if I tell you that Office 11 won't work on any OS version prior to Windows 2000, so any Windows 95/98/ME or NT 4 installations that you have are now officially frozen out in the cold. It's hard to blame Mic
rosoft too much. If it wants to build upon the underlying APIs, then at some point it must create some sort of consistent base.
**I can't imagine there will be many people trundling along with Pentium II/233 computers under Windows ME, who will stump up the upgrade money to buy into Office 11. These are users who are either squeezing the last remaining life out of their relatively ancient hardware and software investment - and will eventually do a clean-sweep upgrade to something more re
cent - or belong to that identifiable group of people who are perfectly happy with what they've got and have no intention of moving.
**A couple of weeks ago, I was asked by a friend to try to sort out a printer driver problem at a local firm. I dropped by one afternoon and discovered an office stuffed with computers running WordPerfect for DOS. My eyes popped out on stalks. The head of the firm was adamant that WP/DOS did everything they needed, and that he didn't mind buying machines wi
th 'this Windows stuff' on it, provided he could get WP up and running in a full-screen DOS session. He meant Windows 95 and 98, which had been passed forward from some old hardware during the late 1990s. I never expected to see an office working on Pentium 4 machines and WordPerfect for DOS, but there you go. The boss had no objection to paying money to go to Office for Windows if it offered him a proven business benefit, but as far as he could see there was none to be gained.
**Java bundling
*Just before the world shut down for Christmas, the news was that a judge in the US had decided that Microsoft must bundle Sun's version of Java with the Windows platform. We all heaved a huge sigh of relief, as this Microsoft Java vs Sun Java nonsense has been going on far too long.
**Sun claims that having the old Microsoft JVM installed on the Windows platform pollutes its chances of getting a good JVM out into general use. I think it's right - many people use the Microsoft
JVM because it's there, but this stifles the uptake of the full JVM by 'those in the know' and by corporate sysadmins who can enforce software policy on their user base. If you're doing serious Java work, you'll need the Sun run-time or an equivalent - the Microsoft one doesn't cut the mustard.
**I've asked Microsoft why it doesn't just ship the Sun JVM, and have received a variety of responses - sometimes they wring their hands and say 'but there's nothing wrong with ours', which is p
atently foolish nonsense, while others claim that Microsoft's legal agreement with Sun meant that it had to ship its own JVM under tightly controlled limits, and it would all be fine if Sun stopped jumping around changing its mind all the time. The truth, as always, probably lies somewhere in between.
**One thing is crystal clear: users weren't and aren't being best served by Microsoft's actions, and shipping a proper JVM is to everyone's benefit. So what if Sun's JVM is a competitor to
the .NET run-time? Competition is a good thing, and if this makes Microsoft try harder than ever before with Visual Studio .NET I'm all for it. The verdict has cleared the air about the whole JVM issue - developers and users can now decide which is best for them without corporate warmongering polluting the discussion. Plus, there may even be a strange upside for Microsoft in this.
**You might remember that some six to eight months ago, I predicted that Microsoft would buy Macromedia to g
et hold of its stunning Flash MX technology. Microsoft has nothing in the 'Universal Canvas' arena and there's no real sign of work being done there either. The firm seems to have just given up on client-side programming, which is an astonishing state of affairs. As I suggested all those months ago, too many senior Microsoft people seemed rather well versed in what Macromedia was up to, and it piqued my interest.
**One of the key components in Macromedia's toolset is the middleware engin
e built on the J2EE Java platform: wouldn't it be ironic if Microsoft privately welcomed this clearing of the air over the Java JVM issue, thus 'allowing' the purchase of Macromedia and enabling it to expansively claim that it fully supported both Java and .NET run-time? Obviously, the rest of the world would want some sort of reassurance that the other Macromedia tools won't be allowed to wither on the vine. Like many people, I'd be deeply displeased to see the excellent Mac development
and run-time tools sidelined by some nefarious actions of Microsoft. Only time will tell.
**FireWire hard disks
*Here's an excellent tool for you to keep among your range of hammers. Trundle over to <a href="http://www.wiebetech.com" target="_blank">www.wiebetech.com</a> and look at the DriveDock products. These consist of a small interface box that you attach to the rear of an IDE hard drive, connecting to its power and data lines. Internally, it interfaces to FireWire, so you have an im
mediate facility to take a hard disk, pop it onto the DriveDock and mount it straight onto any desktop or server that has FireWire.
**Which operating systems and hard disk formats interoperate depends on the computers you have available and the format on the disk; for example, a FAT32 hard disk works just fine on a Windows 2000/XP or Mac OS X computer. However, an NTFS partition will only be read by 2000/XP, and a Mac File System partition will work only on a Mac OS X computer. But these
limitations are obvious. What's great about these interfaces is that they're designed for you to take a bare hard disk and slap it into a test computer within a matter of seconds. No fiddly IDE cables to be connected, no BIOS settings to be tweaked and, most important of all, no reboot required. Pop the drive onto the DriveDock and plug in the FireWire cable.
**For someone like me who has to deal with a range of disks for Ghost purposes and for forensic use too, these will come as a god
send. I've ordered one of the AC-powered DriveDocks and one of the smaller devices aimed at 2.5in laptop hard disks, and I'll report back when these arrive from the US. Since Windows 2000 and XP support FireWire in the standard OS and an interface card costs peanuts, do you have any excuse for not having FireWire?
**Mac emulators
*Just a final word on the performance of Connectix's Virtual PC emulator for Macintosh OS X. With the release of 10.2.3, there has been a considerable speed-up i
n the core operating system, and nowhere can you see this better than when running Virtual PC. My battered, tired, but oh-so-stylish Macintosh Titanium PowerBook laptop may be nearly two years old, it may have a rather feeble 500MHz G4 processor, but it can now run Windows 2000 Professional in an emulator window at a perfectly usable speed.
**I was going to buy a Tablet PC, but I'll confess that the thought of a new 1GHz 'MacTit' armed with 1GB of RAM, the excellent built-in DVD writer a
nd its gorgeous widescreen TFT panel might just push my money towards Apple yet again. Its speed running Windows 2000 Professional, XP or .NET Server would be more than good enough, and it would be far more fun than yet another boring grey laptop booting a single OS.
After a little poking around, Jon Honeyball gets used to the features in Microsoft's new kid on the block
April 2003
102 (April 2003)
Advanced Windows
Simon Jones
XML for Office 11
Details have been emerging over the past month as to what Microsoft has in store for developers using Office 11. All Office applications will still use the Visual Basic for Applications language and development environment, but Word and Excel will have new project types in Visual Studio .NET, which come under the name of 'Visual Studio Tools for Office'.
**The Office 11 object models are to be extended to give support for the new XML features in Word and Excel. Both developers and end u
sers will be able to create documents where parts of the content are tagged according to an XML Schema, giving both structured documents and the ability to pull data out of such documents using standard XML tools.
**Previously, this kind of task would need to have been performed by automating Word and finding the required data sections using bookmarks. This process can be a little unreliable, as users often unintentionally delete bookmarks or mis-position the data so that it doesn't lie w
ithin the bookmarks. It's usually safer to have template documents made by a competent programmer than rely on end users to get it right every time.
**These Word XML documents could be based on a user-defined XML Schema or on an industry-standard one. You can think of XML Schemas as database definitions that tell the application what data should be captured or stored in any given situation. For example, your company might define an XML Schema for a sales order that would include details o
f the customer; the shipping address; billing address; the date and order number and details of what goods were to be shipped; part numbers; quantities; and maybe other sorts of information too.
**Within your company, you'd use this XML Schema to move sales orders between various computer systems. And later, if one of your customers wanted to send you purchase orders electronically, they could output a purchase order from their computer system as an XML document that conformed to their o
wn purchase order XML Schema and electronically transform this into your sales order schema. In order for them to do this, you just need to send them your sales order schema definition. Over the past few years, various industries have been busily defining XML Schemas that can be used by anyone in that industry. The insurance and banking sectors have led the way, but there are now schemas published for the building, farming and legal industries among many others. Using these industry-standa
rd schemas can greatly simplify electronic data interchange both between and within companies.
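**To make that concrete - and this is an invented, toy example rather than any published industry schema - a sales order schema and a matching document can each be only a few lines long, and any XSD-capable processor can check one against the other. The sketch below assumes the Python lxml package for the validation step; the field names are made up:
# Toy sales-order schema and document of the kind described above; field
# names are invented. Validation here uses the lxml package (an assumption -
# any schema-aware XML tool would do the same job).
from lxml import etree

XSD = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="salesOrder">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="customer" type="xs:string"/>
        <xs:element name="orderNumber" type="xs:string"/>
        <xs:element name="partNumber" type="xs:string"/>
        <xs:element name="quantity" type="xs:positiveInteger"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

ORDER = """<salesOrder>
  <customer>Acme Widgets Ltd</customer>
  <orderNumber>SO-1001</orderNumber>
  <partNumber>PW-102</partNumber>
  <quantity>3</quantity>
</salesOrder>"""

schema = etree.XMLSchema(etree.fromstring(XSD))
order = etree.fromstring(ORDER)
print("Order matches the schema:", schema.validate(order))  # True for this document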
**Until now, the use of XML has been restricted to big database systems and companies large enough to employ IT staff with the necessary skills, but Office 11 could change this situation radically. It brings tools for creating XML documents to millions of office workers and to the IT professionals who support them in companies big and small.
**Word 11's Object Model contains new objects and co
llections for XMLNamespaces, XMLNodes, XMLSchemaReferences, XMLTransforms and XMLChildNodeSuggestions. You can now mark up text in a Word document with XML tags and these tags may then be formatted according to XSL style sheets. Word 11 includes a new display mode that shows the XML tags for editing, but these tags can just as easily be hidden from the end user. You may also include XML data in a Word document, specifying how it should appear through an XSL Transform file. This is done ei
ther through the Word user interface or programmatically, but in either case it creates a new INCLUDETEXT field that can be set to update manually or whenever the document is opened.
**Excel 11 has new XML features that enable both end users and programmers to map data from an XML file to cells in an Excel workbook. End users get a simple drag-and-drop interface from an XML Source Task Pane, while for programmers the Excel 11 Object Model includes new objects and collections for XMLNamesp
aces, XMLSchemas, XMLMaps and XMLDataBinding. Linking an Excel workbook to an XML Schema means that the data in the workbook is automatically validated against the schema so that data can be gathered from users in the correct format for consolidation/transmission to other systems that use the same schema.
**Smart Documents
*The Smart Documents feature builds on the XML Schemas facility by allowing developers to program their own Task Panes that react to where the user is in any XML docum
ent. As the user moves round within the document, the Task Pane sees which XML Node is current and can then show appropriate help, related tasks, links or supporting data, as decided by the developer. Each Smart Document is marked with the location of the required 'solution files', which can include any .NET or COM DLLs that are needed by such a custom Task Pane. If these files are held on a trusted server, they're downloaded and installed onto the user's PC automatically, and these files
also get updated automatically if the developer releases new versions.
**This means Smart Document solutions can be deployed merely by publishing the supporting files to a trusted server and then creating a Word document or Excel workbook that uses those files. Anyone who opens such a document or workbook, whether from an email message, file system folder or any other place, will see the new Task Pane (provided their computer can see and trusts the server holding the supporting files). T
his kind of solution is perfect for use within a single organisation where there are established trust relationships between servers and workstations, though it isn't designed for deployment across the Internet.
**I'm still seeing far too many companies where a secretary designs a form in Word, prints it out, photocopies it and gives out the resulting reams of paper to the other people in the organisation for them to complete by hand. The data on the completed forms then has to be typed b
ack into one or more computer systems, often generating several queries a day over illegible handwriting. At the end of the month, someone will order a summary of the data from the computer systems and this will arrive as several paper printouts. The final indignity is that some of the data from these printouts will be retyped into an Excel spreadsheet that will be printed out and put on the boss' desk. There are much better ways of doing all this (and much more).
**By putting Smart Docu
ments and XML Tags into Word and Excel, Microsoft hopes it will make it easier for organisations of any size to capture data just once, at source, and then to consolidate or repurpose that data easily. Yes, it's yet another set of tools and techniques for you and me to learn, but I think this time I'm looking forward to it.
**Smart Tags
*Smart Tags were introduced with Office XP (Office 10). They're the annoying purple underlines and floating buttons that 'recognise' words you've typed
as being names, places, dates, stock market symbols and so on and offer to allow you to make an appointment with a person, find a map of an address, set up a meeting on a date or whatever. If you find these things useful, please, please write and let me know why, as I've yet to encounter anyone who wants to know anything other than how to turn the darned things off.
**[Aside: Go to the main menu in Word, choose Tools | AutoCorrect Options... | Smart Tags (tab). Untick the 'Label text wi
th Smart Tags' and the Show Smart Tag Actions buttons checkboxes. If you want finer control, you can untick the individual recognisers instead. Click the Save Options... button and untick the options for 'Embed Smart Tags' and 'Save smart tags as XML properties in Web pages'. Click OK twice. Save, close and re-open a document and you shouldn't be pestered by Smart Tags again.]
**Microsoft has previously published a Smart Tags toolkit that allows you to write your own recognisers and acti
ons, which will hopefully be ones that are less annoying than Microsoft's. Now, with Office 11, we get 'Smart Tags Version 2', and Microsoft boasts that these offer 'more intelligent recognition and richer actions'. To my mind, this simply means that Smart Tags will be even more annoying, as they'll be able to act immediately whenever they think they recognise what you've typed rather than just tagging the text and waiting for you to press their button and select an action. They'll also be
able to directly modify the current document, formatting the text, adding extra text, or links to other documents or web pages.
**If you're determined to write your own Smart Tag recogniser, please make it very selective, perhaps recognising only your company's part numbers or product names. For example, I turned off all Smart Tags on my PC because I got fed up of having Word offer to draw me a map every time I type the word 'Enterprise'. There are seven towns and cities named 'Enterpri
se' in the US and one in Canada, and I don't want a map of any of them.
**Access 11
*Access 11 has XML support that's more 'standard' than in previous versions, and developers can use their own XSL Transforms to import data to or export data from Access. If imported data has an accompanying XSD Schema definition, Access will use that schema and recognise defined data types, but it isn't mandatory to have a schema definition before you can import data. Exporting data can now include relate
d tables automatically. These features are available to programmers as well as end users via a new TransformXML method and enhancements to the existing ExportXML method.
**For some reason, Access 11 has also been 'blessed' with Smart Tags. Microsoft thinks that when a developer puts a 'Full Name' field on a form, they may want to mark it with a Smart Tag that will put a purple underline on the data and show a floating/pop-up button that will offer to send an email to the person named. I
can't see any good reason for this - if the developer wanted the users to be able to email the person named, they could just put an ordinary command button on the form.
**SharePoint technology
*In Office 11, Access, Outlook and Excel can link to, import, publish or edit lists of data held in SharePoint Team Services (SPTS). SPTS lists could be events, ideas, contacts or other information - they're lists of items that are shared among the team of people who use the SPTS portal. You can cr
eate a list in an Excel workbook and then publish it to your team's SPTS website. Excel programmers may use the new ListObjects, ListRows, ListColumns, ListDataObject and ListFormat collections and objects to create and query such lists.
**There's more support for SPTS in the SharedWorkspace set of objects and collections, which allow programmers to save and load documents to and from SPTS websites. These features are also built into new Task Panes so that end users can work with documen
ts in SharePoint without having to leave Word and start Internet Explorer.
**Visual Studio tools for Office
*While VBA is still included in virtually all Office 11 applications, developers for Word and Excel will now have the option of using new project types in Visual Studio .NET. There are VS .NET project templates for Excel workbooks and for Word documents and templates (DOC and DOT files). These tools give developers the opportunity to use C# or VB .NET to produce robust, secure, mana
ged .NET code that runs behind Office documents. ADO .NET, the database library in VS .NET, transports and persists data using XML so it's a good match with the new XML features in Word and Excel. The programmer gets all the advantages of the VS .NET environment - improved Intellisense abilities, a better code editor, more help and so on - but also the disadvantages. For instance, VS .NET doesn't allow a programmer to edit the code while it's running, which has been standard in VBA since i
t was invented.
**Using VS .NET permits 'no touch' deployment of projects as with Smart Documents. That is, it doesn't matter how the user receives the document, the code behind it will be downloaded and run automatically, provided that user and network security settings allow it.
**Outlook/Notes connector
*Sneaking it out in the busy days just before Christmas, Microsoft announced a new 'connector' to allow people to use Outlook as a front end to Lotus Domino servers in place of Lot
us Notes. The connector only works with Outlook 2002 (Office XP) and Domino 5 (5.0.1, 5.0.6, 5.0.8, 5.0.9, 5.0.10 or 5.0.11). It supports common functions such as email, calendaring, scheduling, address book and to-do task items, but it doesn't permit access to any Notes databases or applications, nor allow you to have both an Exchange Server and Domino server in the same profile. You must have the Lotus Notes client already installed and configured on a PC before you can install the Micro
soft Outlook connector.
**This sounds like good news for companies that have Notes/Domino, or a mixture of Domino and Exchange servers, but want to standardise on Office XP/Outlook 2002 for the clients. The lack of support for Notes Databases/Applications will, however, rule it out for serious Notes users. Anyone with a valid licence for Outlook 2002 (Office XP) can download and use the connector, but it's much more of an IT department's job to install and configure than something an end user sh
ould consider tackling. If you want it, talk to your IT support people. They'll find details of the Notes/Domino connector at <a href="http://www.microsoft.com/office/ork/xp/journ/outxpcon.htm" target="_blank">www.microsoft.com</a>
In the grand schema of things, Simon Jones seems impressed with the additions to the updated suite
April 2003
102 (April 2003)
Applications
Jon Honeyball / David Moss
Farewell Fred
So it's farewell to Fred, who has served me long and true: he never whined or complained, but like any good servant just got on with the work in hand in tireless fashion. He wasn't a particularly flash kind of guy and was more than a bit slow and crumbly around the edges by the standards of today's hot-shots, but he still did sterling service. Until this morning that is, when a press on the power button resulted in Windows NT 4 telling me that important parts of the kernel were missing.
**Fred.woodleyside.co.uk was a Pentium II server running at a leisurely 266MHz. Fitted with 128MB of RAM and a 6.4GB Quantum hard disk, he hummed along in the background acting as Backup Domain Controller for my old Alpha-based NT 4 PDC. His operating system was the slightly weird NT 4 Terminal Server Edition, but carefully patched. The only reason Fred hadn't been put out to grass much sooner was that he had an irreplaceable installation of SessionWall 3 (SW3)
. SW3 is one of those programs that's almost magical in what it does, but demands a financial investment in the licence that can make grown men weep. Indeed, SW3 has itself been put out to pasture as a product, swallowed up and merged into some giant security suite from Computer Associates.
**The combination of Fred and SW3 was unbeatable. Whenever I saw a level of traffic through my Cisco router that didn't seem quite right, a quick glance at SW3 running in a Terminal Server window from
Fred would immediately reveal the source IP address of the conversation, the destination IP address and a complete decoding of the traffic. Whether it was web HTML traffic, SMTP or POP3 or a Telnet session, SW3 would cheerfully decode the byte stream into something visible on screen. In moments of extreme laziness, I was even known to read my incoming email via SW3, as it pieced together text out of the SMTP data stream.
**Obviously, for SW3 to see all the traffic that flowed in and out
of my network, Fred had to be connected to a hub alongside the SonicWALL firewall. The hub was important, as a switch would have intelligently filtered out all the traffic, of course.
**Now that Fred's gone to the great computing store in the sky, I needed something else to monitor my incoming and outgoing Internet traffic, and since I didn't want to spend much money - certainly not thousands of pounds - it had to be readily available. My first step was to consider how to collect the tr
affic data: SW3 uses a network card in promiscuous mode, which sucks up all the passing Ethernet packets, irrespective of whether they're destined for that computer or not. A cleaner solution would be to use the firewall itself to send logs of all connections and transfers, which is easy to do on a decent firewall like a SonicWALL - syslog is the facility to look for. Basically, syslog sends a stream of conversation information to a recipient storage server, which can then do the analysis.
Just drop the IP address of the syslog recipient server into the syslog entry on the SonicWALL and the stream of data starts up.
**Obviously, I needed something at the other end to receive this syslog stream from the firewall. A Windows 2000 server service would be ideal, because it could run all the time in the background, and a nice one is available from Kiwi at www.kiwisyslog.com. I dropped this program onto a Windows 2000 target server and fired it up, logging all the syslog informat
ion to a local file. Finally, I needed some sort of analysis package to read this log file and report back what was happening. In the end, I opted for RnR's ReportGen for SonicWALL.
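**If you're curious what the receiving end amounts to, it's really no more than a UDP listener on the conventional syslog port appending messages to a file. The following is a bare-bones sketch for illustration only, not a substitute for the Kiwi service:
# Minimal syslog recipient sketch: listen on UDP port 514 and append each
# message, prefixed with its sender, to a local file. A real service needs
# rotation, parsing and error handling - which is what the Kiwi tool provides.
import socket

LOG_FILE = "firewall.log"        # invented filename
LISTEN_ON = ("0.0.0.0", 514)     # binding to port 514 usually needs admin rights

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN_ON)

with open(LOG_FILE, "a", encoding="utf-8") as log:
    while True:
        datagram, (source_ip, _port) = sock.recvfrom(8192)
        log.write(source_ip + " " + datagram.decode("utf-8", errors="replace").rstrip() + "\n")
        log.flush()              # so the file can be tailed as traffic arrives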
**As a solution, it works well - I can keep my eye on the Cisco LEDs then go to the Kiwi tools for an immediate 'eyeball' view of the traffic, or to ReportGen to see a more comprehensive analysis. Of all these components, ReportGen is the weakest link and I'd be happier with something stronger. SonicWALL does
an analysis package, which I believe also works on the basis of streaming syslog information from the firewall out to the analysis engine.
**The moral of the story is simple - there's a lot of information available, even from a humble firewall, and you can slice and dice it to your heart's content. Just log it to one central place and use the tools of your choice. I might even write an analysis engine in Excel: it would work well in a data PivotTable, methinks.
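**And if Excel isn't to hand, a few lines of script will give you the same sort of 'eyeball' totals. The sketch below assumes SonicWALL-style 'src=' fields appear in each logged line - adjust the pattern to whatever your own firewall actually emits:
# Quick-and-dirty analysis sketch: tally the ten busiest source addresses in
# the collected syslog file. The 'src=' pattern is an assumption about the
# log format, not a guarantee.
import re
from collections import Counter

PATTERN = re.compile(r"src=(\d+\.\d+\.\d+\.\d+)")
totals = Counter()

with open("firewall.log", encoding="utf-8") as log:
    for line in log:
        match = PATTERN.search(line)
        if match:
            totals[match.group(1)] += 1

for address, count in totals.most_common(10):
    print(address, count)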
**Server disaster recovery
*Just before Christmas, I was called to an office where the server had met a sticky end. It was a fairly new Compaq, so there was no problem obtaining an identical replacement almost overnight. I popped in the Windows 2000 Server CD and did a build of the OS to check that everything was working adequately and noticed no problems. I then took the backup tape and dropped it into the DAT drive and started the recovery process to bring all of the C drive back in from tape onto the hard disk
. This went swimmingly until the end, at which point I received a heap of error messages regarding the Windows file protection system. Unperturbed, I pressed on and got to the point where the inevitable reboot was required, whereupon the system failed to start, citing a corrupt kernel file as the reason. I went back to the boot CD and did the basic reinstall all over again.
**At this point, I noticed that we had two CDs - one was original Windows 2000 Server and the other was a bootable C
D containing Windows 2000 with SP 3 applied. Unfortunately, the version that had been on this server, and thus on the DAT tape, was SP 2. I got the familiar feeling of being in a maze of twisty passages, all alike. Rather than dive into the deep and meaningful reasons why this was happening, I decided to blast in a new Active Directory install on the virgin SP 3 installation. There were only four accounts to worry about and getting them up and running was the priority, sorting out email be
ing a secondary issue.
**When I came to Exchange Server, I noticed that all the desktop machines were running Outlook. There were no worries there, except that all these machines had their own local storage of all the email messages. At this point, it dawned on me that whoever had previously fiddled with this system had set up Exchange Server as the world's most expensive POP3/SMTP mailbox, whereby nothing was kept in the server itself. As soon as the Outlook client connected to the serv
er, it sucked the inbox down to the local store and emptied the server mailbox. Performing disaster recovery on this server was therefore a waste of time, as there was no data to recover - it was all on the desktop machines (where, of course, it hadn't been backed up at all).
**Did I mention that the internal domain name was a '.local' too, showing real depth of DNS understanding? It also transpired that the NT 4 desktop machines weren't part of an NT 4/Windows 2000 domain, but instead pa
rt of a workgroup that happened to have the same name as the server domain. Printing was direct to the IP address of the laser printer from each desktop, and printer queues weren't used. Yes, it was a truly horrible mess. Fortunately, it's now in the process of being migrated to a proper installation with accounts, domains and a working Exchange Server installation. However, I do wonder what some people have been smoking when they pop that CD into the server and run the setup programs.
**XDocs and OneNote
*XML is an open standard. It's text that represents data and, although it couldn't give a damn what operating system you use, the Windows platform will take it mainstream. Microsoft will achieve this through Office 11, the current name for the next generation of Office, by making XML the file format of choice for all the regular Office applications and via a brand-new application called XDocs.
**Complex XML Schemas are still going to require techies to create them,
but simple schemas will be easily achievable by anyone, and you'll be able to work with XSD files without needing a degree in computer science.
**Let me illustrate this by offering a scenario: you're the IT manager who has to find the right person for a job, and you've had hundreds of applicants. You need to come up with a shortlist of people you want to interview, so you start off by handing the résumés out to four of your staff and having them run an initial assessment based on the information contained in each résumé. Your HR bods have determined the best fit and your IT staffers have written a simple schema with some detail on who read the résumé, the contact details for each applicant and their degree of 'fit' for the job defined as a simple integer value. You now want to quickly create a table that will let you see who is best for the job.
**Fire up Excel and select the first of the four files that have been prepared for you. Nothing clever was done here other tha
n saving the files in XML format rather than the older XLS. You open the file and are presented with an Open XML dialog that offers you three choices: open the file as an XML list, a read-only workbook, or have the XML Structure task pane display the schema fields to you on the right-hand side of the workbook. Choose the latter option and the schema appears in the regular task pane location. Look at the available fields, select the first and then use the <Ctrl-Click> key combination to sel
ect some others and drag and drop them onto your Excel worksheet.
**Now close the editing view, hit Refresh and you're presented with the data from the first document under each of the chosen headings on your worksheet. Sort on Fit and then use <Ctrl-Click> again to select the First name and Fit columns. Click on the Chart Wizard button and on the Finish button, as you know the default chart type is fine for what you want. Instantly, you see who fits your criteria, but you know you have
three more data files to view.
**Head to the Data menu, select XML and Data Range Properties, and then choose the option to append the new data to the existing data, import the three remaining files and do your sort on Fit again. All finished. Information, as they used to say, at your fingertips.
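**For what it's worth, the same merge-and-sort could just as easily be scripted outside Excel. The sketch below is illustrative only: it assumes each of the four files uses a simple, invented schema with a <candidate> element carrying <firstName> and <fit> values, which is the shape of data the scenario above describes rather than anything the Office tools dictate:
# Illustrative merge of several candidate-assessment XML files, sorted so the
# best 'fit' comes out on top. File names and element names are invented.
import glob
import xml.etree.ElementTree as ET

candidates = []
for path in glob.glob("assessment*.xml"):
    for person in ET.parse(path).getroot().iter("candidate"):
        name = person.findtext("firstName", default="?")
        fit = int(person.findtext("fit", default="0"))
        candidates.append((name, fit))

for name, fit in sorted(candidates, key=lambda item: item[1], reverse=True):
    print(name, fit)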
**You chose Excel for this task because it was the best tool for the job, giving you list and charting capabilities. What your staff members didn't know is that you'd already checked all the résumés yourself and had your own opinion on who was the best fit for the job. You'd given them the task to see how they measured up - you have one or two promotions in mind and wanted to know who was best for the job, and this was just one of the tests you'd devised. So now you want to fill out an evaluation form on each of them. You reach resignedly across the desk to the sheaf of forms sat in a tray... and then drop the whole lot into the wastepaper basket.
**Smiling, you turn back to y
our PC, fire up Microsoft XDocs and open a form you'd prepared earlier. You start to fill the form in and soon have all the details for the first of your staff entered in the relevant fields. Do you now save the form and start to fill out another? No, because that would be the electronic equivalent of filling in the inflexible paper form. This time you go back to the first section you want to duplicate, click on one item and start to fill in the form again. You do the same thing on each se
ction as you fill it in for each candidate. The comments you write on each candidate are different in length, but that doesn't matter, because the field has been resized to reflect the amount of text in it, meaning you can print the form straight away and know that all the data will be visible.
**XDocs is a hybrid application, providing forms capabilities linked to all the word-processing capabilities with which you're familiar, which, of course, includes invaluable items like spell-check
ing. It's a proprietary format in that if you want someone else to be able to share your forms, they have to have their own copy of XDocs. However, I don't see that being a disadvantage for long, and XDocs will undoubtedly be available as a standalone application as well as a component of Office 11 when it ships sometime this summer.
**You'll want a copy of it, just as you'll want a copy of OneNote, another new Microsoft application due to ship at the same time and presumably in standalon
e form as well as bundled with Office 11. OneNote is the electronic note-taking companion you always wanted, but never quite had. You can sit in a meeting taking notes in OneNote while simultaneously recording the meeting as a WAV file onto your notebook or Tablet PC. Have you ever sat and stared at a bunch of notes after you've taken them and tried to remember what was being said at the time that caused you to make the note? Now you can simply click on the note and on the Play button, and
OneNote will play back what was being said from just before you wrote the note for as long as you might wish afterwards.
**Sadly, I don't have a copy of OneNote, so I wrote this column the old-fashioned way based on notes taken at yesterday's Microsoft press conference, at which Jean Paoli - one of the creators of XML (and SGML) and now a Redmond 'XML architect' - briefed us on all this stuff. A day later, it's still fresh, but in several months my memory of it will have faded. If you're
a student taking notes at a lecture using OneNote, not only can you refer to them at any time, but also listen to the dulcet tones of the lecturer while doing it. Or if you have a meeting with the boss from hell who adopts the approach that everything should be said at nine million miles an hour - and woe betide anyone who doesn't get it all first time - you now won't care, as you have OneNote. Your system crashed while you were taking notes and now they're lost to the ether? Not any more
- or not unless your hard disk exploded too - because OneNote writes your data to disk as you add it (and that could be either typed or ink data). What's more, if you search for something, OneNote reads your inked words as well as your typed words and highlights them.
Jon Honeyball puts an old friend out to grass, while David Moss gets excited by XDocs
April 2003
102 (April 2003)
Back Office
Paul Lynch
iPod updates
Small changes can have large and unanticipated consequences. This is one of those universal truths that gradually work their way into our consciousness as the result of countless unfoldings of the rule. As a sometime programmer, I know very well to consider all the implications of what might seem to me an inconsequential update. Sometimes I get lucky and a small change makes the users of some application much happier, but more often it doesn't.
**So it is with the iPod. I'm a big fan of
the iPod as a music player, but a minor change to its menu structure that pushed the Track Search function one menu-level down almost made me stop using it.
**In that same column, I talked of the trouble I was having with my iPod losing charge overnight. When first bought, it would happily play the full claimed ten hours and I could even leave it for a couple of days without recharging and still get almost full playback time. As the iPod aged, this time reduced gradually, which is an exp
ected result given that battery technology is less than perfect. However, after installing the iPod 1.2 software update, I found myself getting on a train in the morning with a dead iPod in my hands unless I'd charged it up the night before.
**At times like that, it isn't possible to blame it solely on the software update. It's just as likely that age was to blame for the problem, and that my change of usage pattern with the new software made it more obvious. However, I'm not the only one
to notice this rapid power loss when switched off, and I'm inclined to attribute it to the 1.2 timer software. To solve the problem, just revert to the 1.1 firmware (download the software update for that version and install it). If you don't like that, there's a partial cure that keeps the 1.2 firmware. After connecting to a Mac (which resets the internal clock), try resetting the iPod immediately after ejecting from the machine - which is when the clock/timer is set - but before disconne
cting it, by holding down both Menu and Play buttons simultaneously.
**Part of the mystery is patch version 1.2.1, which has a cryptic description on the software download site for iPod <a href="http://www.apple.com/ipod/download" target="_blank">www.apple.com/ipod/download</a> (this isn't linked from the main <a href="http://www.apple.com/ipod" target="_blank">www.apple.com/ipod</a> web page) saying: 'iPod Software 1.2.1 fixes a problem with the Battery icon. It now correctly indicates
a full charge.' And that's all. A mystery, but one that required a full patch release from Apple. It's possible that this patch also fixes the battery charge problem itself, but I haven't had it installed long enough to be certain (first impressions are good).
**Just for the record, the previous iPod patch, 1.2, which re-organised the menu so inconveniently for me, did also add some beneficial features that I skipped over, apart from the Calendar and Clock features. The main one I like i
s a message on the iPod screen that gives extra information about connection status and tells you when it's safe to disconnect it from its umbilical FireWire cable.
**If this isn't enough, there's another iPod software update, 1.2.2 (<a href="http://docs.info.apple.com/article.html?artnum=120123" target="_blank">http://docs.info.apple.com/article.html?artnum=120123</a>), which is European only. Don't bother downloading it, however, as it's there solely to reduce the sound volume of the
iPod to meet (unspecified) European standards, presumably safety standards, but the patch notice doesn't say that.
**If you feel your battery life has reduced and your iPod is still within its warranty period, Apple will swap out your iPod for a new one, as the battery isn't a user-replaceable component. If, like me, you feel adventurous enough to want to see what's going on inside and try to replace your battery, take a look at <a href="http://home.no/deep/ipod/disassembly" target="_bla
nk">http://home.no/deep/ipod/disassembly</a>
*I mentioned in an earlier column that iCal integration with iPod didn't support the well-known vCal format (despite the coincidental similarity of names). The latest release of iCal for Mac OS X, 1.0.2, improves the partial vCal support that was in the previous version. Tied in with the new version of iCal - which requires Mac OS X 10.2.2 - is the release of iSync, which has been in beta for several months. iSync 1 supports synchronisation of
contact and calendar information between Mac OS X and Bluetooth devices, including Sony Ericsson mobile phones, but now also iPod, Palms, other Macs and even .Mac accounts (Mac-to-Mac synchronisation actually depends on the .Mac account feature). Clearly, the iPod and .Mac accounts (and other Macs) don't need to have Bluetooth support to transfer data.
**iPod Archives
*Still on the subject of my iPod, it gets used from time to time as a convenient, pocket-sized data transfer device that I'
m unlikely to leave on a customer's site. It's easy to access from any FireWire system and, as a result, data is transferred rapidly, reducing the number of CDs that I need to burn when moving data from place to place. As a result, in some ways, it's become more of a 'system of record' than the matching archives on any of my various computers.
**Curiously enough, this even includes the music files that are now on the iPod. Possibly as the result of moving the PowerBook files from hard di
sk to hard disk, and because of several clean reformats of the drive and at least one disk upgrade, some of the MP3s that I now have on the iPod are no longer on my PowerBook. There are solutions to this, which involve changing attributes on the iPod file system to allow you to copy the files, and then delving into the folder structure. Recently, however, I came across some software that's intended to solve this problem and a few other iPod-related situations. It's called iPod2iTunes from
*I'm as guilty as the next person of making assumptions, and I have to confess one that I missed. For the last two or three years, we've all been writing about wireless networking, how great it is and what the evolving standards have been. 802.11b sneaked up on us and is now the dominant technology for both office and home wireless. It offers 11Mb/sec bandwidth, roughly equivalent
to the old 10Mb/sec wired Ethernet standard still prevalent in many small offices and homes. Despite this apparent speed equivalence, in most circumstances 802.11b is still slower than 10Mb/sec Ethernet because it has to handle airwave saturation caused by many devices all in use at once.
**802.11b works in the 2.4GHz waveband, which has reasonable if not great ability to penetrate walls, but can conflict with some domestic appliances such as microwaves. Regardless, it's a good band to use, as
it already has approval in most countries and is free for our use. Right now we can buy PC Cards, Secure Digital cards, CompactFlash cards, and base stations all supporting 802.11b.
**All along I've been mentioning 802.11a as the most likely successor to 802.11b, which runs at vastly greater speeds - 54Mb/sec - but uses a different waveband at 5GHz. Range should be roughly the same as 802.11b (maybe slightly shorter), which is to say anywhere from a few tens of feet to a couple of hundred
feet in open air, without any special antenna, that is. The advantage of moving to 5GHz is that fewer other devices already use it and so data rates in practice are likely to be better than on 2.4GHz. 802.11a has a standard already approved and a small amount of expensive equipment is available.
**However, it now looks as if 802.11g may end up arriving sooner and displacing 802.11a before it gets started. 802.11g is likely to be approved soon, offers 54Mb/sec transmission with the same r
ange as 802.11a would have, and has the distinct advantage of staying on the old 2.4GHz waveband. As such, it should be upwards compatible with 802.11b installations, so that you could upgrade the wireless hub from 802.11b to 802.11g and start using 802.11g devices on the network immediately alongside your existing 802.11b ones. If you do this, you'll probably have to remap locations to cater for the shorter range, and in a complex multiple access-point installation this could become involved.
**Choosing between 802.11a and 802.11g is likely to be decided for us by the manufacturers and legislators. As it is, 802.11a should be slightly ahead in theory with the potential for greater bandwidth. However, in practice, if manufacturers have 802.11g devices available in good time, its compatibility with 802.11b should win the day for them.
**Mobile Communicators
*Although the Nokia 9210 and its successors have found a small but devoted coterie of users, other small 'communicato
r' devices have been notably slow to take off. The Pocket PC family of phones, as well as the Palm ones like Handspring's Treo, have only a small penetration into the business market, appealing more to the gadget buyer.
**In this column last year, I looked at the Pogo and was rather disappointed. Production has now ceased, although its services have been taken up by the main distributor Carphone Warehouse. It's the nearest equivalent the UK business market has to the RIM BlackBerry in the
US (which is best described as a 'pager for email').
**One to watch, however, is the Danger Hiptop, being marketed in the US by T-Mobile as the Sidekick. This is close in concept to the Pogo, although it appears to have better construction quality and greater support. I'd find this device rather more appealing than a Palm Tungsten, if, of course, it were to become available on any of my mobile phone networks of choice. See the owners manual at <a href="http://help.sidekick.dngr.com/inde
x.html" target="_blank">http://help.sidekick. dngr.com/index.html</a>, or other information at <a href="http://www.danger.com" target="_blank">www.danger.com</a>
Paul Lynch feels a little run down, but recharging his battery isn't an option
April 2003
102 (April 2003)
Mobile Computing
Kevin Partner
Director MX
Okay, it's humble pie time. I said in my last column that I couldn't see how Macromedia could incorporate Director into the MX family and I was wrong. It's a fair cop, guv'nor. Director MX (aka version 9) is finally with us, but was it worth the wait?
**I'll begin by restating that I like Director as a tool: it's the most powerful general-purpose multimedia authoring tool bar none. It was one of the first authoring tools to appear on the Mac in the late 1980s and its theatre-related meta
phor - stage, casts and scripts - can be traced back to its roots in a tool called VideoWorks. So, to a real degree, Director helped create the market in which I now work. Thanks, Macromedia.
**There's certainly little that can't be achieved in Director. I recently used version 8.5 to create a complex DTP application for the educational publisher Folens (<a href="http://www.folens.co.uk" target="_blank">www.folens.co.uk</a>), which was aimed at novice users and outputs HTML - this product
is now on retail sale.
**I'd originally chosen VB .NET for this project, but using the .NET Framework demands a higher specification of target platform than can be reliably found in schools. My only other alternative, given my deadline, would have been VB 6.
**With various Xtras (third-party extensibility is one of the strengths of Director) and ActiveX controls, Director presents an interface that allows the novice end user to assemble worksheets from a range of pre-created content,
edit these worksheets, preview and print them. As the project files are in HTML, they can be easily uploaded to a website. I particularly like Director's ability to convert from RTF to HTML format with a single-line statement.
**Would this project have benefited from Director MX? Not really. For PC users of version 8.5, the main enhancements are better integration with Flash MX and Fireworks MX, some slight improvements to the interface (particularly when scripting), better debugging and
support for the Flash Communication Server MX. Macromedia also gives a nod toward creating more accessible content for disabled users by including, for example, a text-to-speech Xtra.
**In other words, this is a fairly limited upgrade when compared with the quantum jump from 8 to 8.5, and the upgrade fee of £299 is hard to justify. Mind you, it's possible to order from the Macromedia US website and save yourself around £50 on the downloadable version (depending on the current exchange rate).
**If you're a Mac OS X user, Director MX is an essential upgrade, being the first version to support this operating system. Equally, if you're still using version 8, as Rob Burgess from Macromedia says: 'If you've been waiting to upgrade, Director MX is the version to get,' primarily for the entirely new 3D engine added in version 8.5 and incorporated in MX. However, Macromedia had originally announced on its own forums that support for Flash MX would be added to 8.5 for free in the
form of a new Flash Xtra. This means you're paying to enjoy better integration between Director and other members of the Macromedia stable.
**Macromedia is keen for the integration between Flash and Director to be as tight as possible so that professional developers feel compelled to buy and use both products. It's a pity the underlying programming languages of the two aren't more similar: Flash's ActionScript is a modern language similar to JavaScript, whereas even the more modern dot-sy
ntax version of Lingo is Basic-like and dated. My guess is that Lingo will move towards ActionScript in the future as Director becomes the 'Pro' version of Flash. Indeed, it's now possible to directly create Flash objects in Director without needing to do this through a Flash cast member - this gives you access to the benefits of ActionScript objects from within Lingo.
**Further evidence of the merging of these two products is the abandonment of the Shockwave Multiuser Server and its rep
lacement with - you've guessed it - Flash Communication Server MX. This is one of the most extraordinary U-turns I've seen and, despite the fact that the Multiuser Server is still included on the installation CD, you'd be well advised to switch to the more powerful Flash Server.
**If you've spent time and money implementing Shockwave-based multi-user games or learning products, you aren't going to be too chuffed that they've become obsolete overnight, but now's definitely the time to jum
p from that rapidly sinking ship.
**Most professional developers will upgrade only because they have to have the latest version, but perhaps the most conclusive evidence that this upgrade offers little in the way of new features is that the Shockwave player isn't being updated for this release. Don't get me wrong, Director MX is the most complete authoring tool on the market today and the integration of Flash MX makes it the ideal product for creating CD-ROMs, DVD-ROMs and, in particular,
for hybrid productions that feature offline storage while connecting to the Internet for collaboration, scoring and updating.
**But why is it that Microsoft's policy of strong-arming its users to upgrade is so derided and yet Macromedia's treatment of the Director community hardly gets a comment? It's particularly disappointing given that upgrades to its other products, especially Flash and Dreamweaver, are always worthwhile and groundbreaking.
**The MX interface itself makes almost no
difference to the way you use Director. It smartens up the interface a little, resulting in fewer pop-up windows, but doesn't interfere with that basic, archaic theatre metaphor. The Scripting window has been improved a little. New buttons list 3D Lingo commands, and particularly useful is a button that automatically lists commands added by any installed scripting Xtras. The message window is now more usable, as the commands are separated from their results and Director MX adds an Object
Inspector that allows you to follow the properties of script instances and watch the elements within 3D cast members and Flash MX sprites.
**Perhaps the most impressive day-to-day change is the close integration with Macromedia Fireworks MX. Director has always had a built-in image editor, but it's rubbish, so most developers create graphics in an external application and import them. Now you can edit a cast member in Fireworks by clicking the Edit button in Director's Property Inspector
window and, on leaving Fireworks, have it updated without having to delete the cast member and re-import it. The integration with Photoshop isn't quite as close, but still far better than previously.
**Director's coding environment still isn't as professional and usable as, for example, the Microsoft .NET editor, but it's far better than the bare-bones editor shipped with main competitor Mediator EXP. In summary, Director MX feels more like version 8.6 than the version 9 it's intended to
be (as indicated by the licence number). Is it worth the
£299 upgrade fee? Not really. Have I paid to upgrade? Yes.
**Mediator EXP 7
*Perhaps Director's strongest competitor in the field of CD-ROM development, version 7 of Matchware's Mediator EXP is now available. No doubt the product will be the subject of a full PC Pro review in due course, but I'd like to take the opportunity to describe how Mediator now combines ease of use with real power.
**My company, NlightN Multimedia (<a href
="http://www.nlightn.co.uk" target="_blank">www.nlightn.co.uk</a>) is currently developing a huge e-learning product that involves a number of designers/programmers working concurrently. I selected Mediator EXP rather than Director for this project because of the productivity benefits it offers while also providing a wide range of features.
**One of the main problems with having a number of designers working on a single project is that they each have their own techniques as well as a ran
ge of experience levels and technical abilities. This can lead to the nightmare scenario of half-a-dozen different programming styles that are next to impossible for anyone else to understand.
**So, I'm creating a range of standard page types that the designer simply has to modify. Users of Mediator in any of its incarnations will know that making drag-and-drop user interaction is pretty easy, but it does involve a lot of repetitive programming each time a new interaction is created. By
using a combination of a standard, modifiable page and a block of VBScript, I've been able to limit the programming input required from the designer to a single line. The other, huge, advantage of doing it this way is that I can make a change to the code and have that reflected across the entire program.
**Take a look at the code fragment in Code Listing One. It's a simple FOR-NEXT loop that reads in the initial x and y position of each drag-and-drop object. These objects will be named dr
ag1, drag2 and so on, and line 1 associates the variable vbMediatorObject with each drag object in turn. I can then use the simple form shown in line 2 to fill a two-dimensional array with these positions using dot syntax. This means that the designer is free to place the drag objects anywhere on the screen and Mediator will automatically read them in. This is important, as it allows the drag objects to snap back to their original position if the user gets the question wrong.
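**Code Listing One isn't reproduced here, but a minimal VBScript sketch of the sort of loop described above might look like the following. This is my own illustration only: GetMediatorObject, dragCount and startPos are stand-in names for whatever the real listing and Mediator's own scripting interface actually use.
' Hypothetical sketch only - not the actual Code Listing One.
' GetMediatorObject stands in for however Mediator really binds a named
' stage object to the vbMediatorObject variable.
Dim startPos(20, 2)          ' x in column 1, y in column 2, one row per drag object
Dim dragCount, i, vbMediatorObject
dragCount = 6                ' set by the designer to match the objects on the page

For i = 1 To dragCount
    Set vbMediatorObject = GetMediatorObject("drag" & i)   ' 'line 1': bind drag1, drag2 and so on
    startPos(i, 1) = vbMediatorObject.x                    ' 'line 2': read the position via dot syntax
    startPos(i, 2) = vbMediatorObject.y
Next

' If the user gets the question wrong, object i can be snapped back with:
' vbMediatorObject.x = startPos(i, 1) : vbMediatorObject.y = startPos(i, 2)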
**Any proper
ty of any object can be accessed in this way. For example, you could store the current frame number of a Flash movie, the border colour of a rectangle or the transparency level of a picture. And, of course, this also works in reverse. For instance, vbMediatorObject.x=300 sets the x position of the Mediator object associated with vbMediatorObject to 300.
**Although it will always be limited by the capabilities of VBScript, Mediator EXP has evolved way beyond its roots as an easy-to-use mu
ltimedia authoring tool. It also continues to improve in its traditionally strong areas. Perhaps the most underused feature introduced in version 6 of Mediator EXP is the Animation Track, which allows you to control visible objects, manipulating their properties as they move along an animation path. Exactly which properties you can manipulate depends on the type of object.
**As an example, I've dragged a polygon object from the Multimedia Catalog to a position just off the screen and adde
d a button to start the Animation Track. The screenshot shows the Animation Track dialog box. A PathPosition track is added automatically. At first, it looks complicated, but it's very intuitive. The bottom axis shows the elapsed time and the vertical axis the percentage position of the object along the path. The small squares represent nodes, or positions along the track. So, if you look at the first node, you can see that it represents a position around 50 per cent along the track after
just over four seconds. The next node is around 40 per cent, so for the period between these two nodes the object is going backwards along the path. It then reverses direction again and progresses to the end.
**The more nodes you use, the closer the control you can achieve. By clicking the Add Track button, you may choose from any of the object's properties. In the case of a polygon, it's possible to choose a track based on anything from the object's opacity to its centre of rotation. In
my example, I've added two more tracks. The first affects the rotation of the object between zero and, in this case, 720 degrees. I've added nodes at the same time point as the PathPosition track, so each change happens simultaneously. The first node represents the polygon rotating through around 700 degrees. The downward line between the first and second nodes indicates the shape then rotates backwards to zero before spinning more quickly between the last two nodes.
**The second new trac
k changes the opacity of the object, taking it from zero to 100 per cent at the first node, then fading out slightly before fading in again. Finally, I've added a different sort of track to change the background colour of the object. In this case, the object changes smoothly from red to orange at the first node, then to purple before ending at red again. I could add nodes covering every property if I wished.
**You can see from this example the extraordinary power Animation Track offers in
such a simple form. Once you understand the unique but powerful interface, you can manipulate an object in any number of ways, far more simply than you could in, for example, Flash or Director.
**Opus Pro XE
*Digital Workshop has been a little quiet lately, following the flourish of activity that saw the release of several versions of Opus, including a plug-in that offers Flash export. However, the firm has recently extended its range with a product aimed squarely at the professional dev
eloper. At £399.95 (inc VAT), Opus Pro XE is £150 more expensive than previous top-dog Opus Pro, but is still substantially cheaper than Mediator EXP 7 and a fraction of the cost of Director MX. Opus Pro XE includes all the features of Pro, most importantly an ECMA-compliant scripting language in addition to the drag-and-drop authoring approach of the entire Opus series. In this way, Opus offers similar features to the Mediator product range.
**XE is targeting people who earn their livin
g from multimedia creation. First, it includes copy-protection software, which only allows a presentation to be run from the CD-ROM. Protection such as this is notoriously difficult to implement, so it will be interesting to see how it works in practice - at the very least it should deter casual software theft. Similarly, you can now embed video clips into your Opus presentation to prevent their theft.
**XE also adds the facility to create time-limited evaluation versions, so that poten
tial purchasers or clients can try out the software without risk to your revenue stream. However, it's not hugely challenging to add time-limiting features to a product created in Mediator or Director.
**The software can also record the time spent working on a publication, making it simpler to bill clients accurately. It also means it's easier for a number of programmers to work on a single publication, as it allows any programmer to attach notes to a page.
**Opus Pro XE distinguishes it
self in its support for the AICC Learning Management System standard. This format was created by the Aviation Industry CBT Committee, originally in 1988, to standardise the way training applications communicate with specialised databases. These databases record information about the learner, including their performance, test scores and progress.
**All major Learning Management Systems support data transfer using the AICC standard, so if you're creating corporate learning applications Opu
s Pro XE's support will make interacting with your client's LMS much simpler (that is, until the newer SCORM standard becomes widely adopted).
**It's possible, once you understand the AICC data format, to create an application in just about any development tool that supplies the data in that format: it's basically just structured text. However, Opus Pro XE is by far the simplest and cheapest way to get into AICC-compliant corporate training, and it's to Digital Workshop's great credit th
at it has spotted this gap in the market.
**Although you can get templates for Flash that make it AICC-compliant, the main players in this field are Macromedia's Authorware and Click2learn's ToolBook Instructor, both of which will set you back around
£2,000. With Opus Pro XE, Digital Workshop is clearly making a play for the corporate training market by addressing most of Opus Pro's weaknesses for professional use. Oddly enough, its Achilles heel may be the low price. I'll let you know more as I get better acquainted with it.
Kevin Partner has to eat his words as he looks at the latest Director upgrade
April 2003
102 (April 2003)
MULTIMEDIA:VIDEO
Steve Cassidy
Junkyard computing
I have a few weaknesses I will admit to, and one of them is being a bit of a junkyard dog. Recycling computers is an unnatural act to discuss in the middle of a magazine whose cover is devoted to the newest and fastest stuff you can currently find in the retail channel, but there's something I find both appealing and revealing about a solid rebuild of an old warhorse.
**It's appealing because you get to find out all the things you did wrong, or just how much better the latest drivers are
than the first releases or the Microsoft standards (see my comment on Sun Java later on). It's revealing because it's astonishing how much of a new lease of life awaits machines that are held back by nothing worse than a slow hard disk, not enough memory (which now costs effectively nothing) and a need to be in use 24 hours a day.
**There's good news and bad news in such junkyard diving. The good news is that I've collected a stable of machines with chequered histories, but which still t
urn over real work for me. Most of these are like the proverbial axe - new head, new handle, but still the same axe - and I'll cheerfully admit the uneconomic amount of time I spent reloading a Compaq Pentium II/350 when I could as easily have given someone
£299 for a 1GHz Celeron.
**I persisted because almost every operating system I've found will run cleanly and immediately on a Compaq, whereas many of the machines that populate the bottom price bracket have little gotchas up their sle
eves, courtesy of cost-cutting motherboard designs and neat but wacky integrated bargain-bin chipsets.
**Then there's the speed-up you get as a 'goodbye present' from a lot of the first-tier equipment makers. I have motherboards here for Slot 1 Pentium II and III CPUs that were originally rated no better than 450MHz from both Supermicro and Abit. However, download the last revision of their BIOSes and you'll be able to go all the way up to 1GHz (so long as you have the dowsing rod, cryst
al ball and large roll of banknotes necessary to be able to unearth a Slot 1 processor that is (a) that fast and (b) for sale in the first place).
**So what has all this got to do with networks? That's where the bad news comes in. It's remarkably difficult to tell the difference between an old refurbished machine and a slightly off-colour new one. It's also remarkably easy to make a new machine off-colour if you don't have the time and attention that can only be spared by systems integrat
ors working in tens of thousands of units, or by hobbyists like me who go into the basement to get away from all the Christmas cheer, and fiddle with PCs to a wholly uneconomic extent. I have two cases to present as illustrations of this matter: one I class as fairly inevitable, the other came as a real surprise.
**Case 1: AMD Athlon, Abit motherboard. I've mentioned this machine briefly in a past column, most likely when it was playing host to my Supermicro P6SBA motherboard. Originally,
this was an HP Brio PC, but its Pentium II/350 and HP proprietary mobo are long gone. Now it has a 1.2GHz Athlon and an Abit motherboard with an on-board IDE RAID controller - the kind you shouldn't be using in servers, no matter how tight your budget because, although they're an array, they surely aren't redundant.
**This machine plays my games, burns my CDs and gets test software thrown at it, but after a while I noticed that gameplay and DVD playback had become notably jittery. Some r
esearch on the Web revealed that by a slip of the designer's finger, this machine marries built-in motherboard resources to specific PCI slots. If you put some other card in the slot that also happens to sit on the on-board IDE RAID, disk I/O goes to the dogs. Of course, on a 1GHz machine sitting in front of a rack-load of dual 450MHz and 500MHz servers, that I/O problem frequently gets hidden in the traffic, until you try to do something really demanding.
**The last straw was when I dro
pped in a junkyard find in the shape of a Matrox video-editing card, which wouldn't record video and save it to a networked drive for toffee. Jitter, jitter, drop-out, crash... A BIOS upgrade accompanied by exactly the right driver set from the Abit site, a judicious reshuffle of PCI cards in the slots, disabling various on-board bits I don't use (like the serial and parallel ports), and a complete blank disk rebuild from scratch was all it needed to turn this into a steaming fast, ten-yea
r-life workstation.
**Case 2: A Dell Dimension Pentium/166. I always had a hunch that these old Dell desktop cases were actually more standard than the parts inside them, and I had an interest in a fairly new Pentium 4 motherboard made by Supermicro, my favourite manufacturer. This device, the P4SBA+, runs a 2GHz Pentium 4 while still using 100MHz DIMMs for memory. It only took a few minutes wandering around <a href="http://www.dabs.com" target="_blank">www.dabs.com</a> to put an order t
ogether and, with a little screw-drivering when it all turned up, I had a machine that on the front says 166, but inside is 12 times faster.
**These two machines are nominally a whole gigahertz apart in performance. The difference becomes shatteringly clear at boot-time and when playing games, but (and here's the important point) the rest of the time I barely notice which one I'm using. I don't have any strong preference to turn one or the other on if I'm going to do some work, because mo
st of my work is server-centric. In truth, if you gave me a Pentium/233 with 200MB of RAM I'd be reasonably happy, because the rest of the network supports it remarkably well.
**Tale of woe
*That brings me to the bad news. A sorry tale has reached me that illustrates my recent observations about the demise of the TV repairman, and how his fate could be repeated in the IT marketplace as PC costs fall. I never expected to hear a tale like this one so soon.
**The traditional model of 'PC s
upport' in small businesses may come as a bit of a shock if you're a capable, hobbyist fiddler or a large-company person. The model is defined by a formula: a company agrees to send you a man in a white van whenever you have a problem in return for 12 per cent of the capital value of your IT investment per annum. In recent years, that 12 per cent has become subject to top-ups and often these firms take intelligent advantage of their relationships, making more out of the deal through hardwa
re and consumable sales.
**This is computing as defined by central-heating fixers: as margins squeeze down, the maths that define their relationships with their clients get tighter and tighter. Of course, there's the same spread of character types in this business as in any other sector of commerce, but the bad ones are, as things get tougher, starting to show some nasty eccentricities.
**In the case in question, it seems the 'maintenance company' had come to realise what I'd long since
concluded - that you can leave most people running with a Pentium II/450 and they'll work pretty happily so long as it gets rebuilt every few years and fed a steady diet of hard disks. In this case, the company had put a pile of old bits into some new cases, sold them to the company at new-machine prices and then been called to account once the situation became known.
**Details become sketchy at this point, but it seems the discovery of this situation and the ensuing negotiations ended
with the maintenance 'engineer' marching through the office, throwing PCs out of the windows...
**This is, of course, an extreme case and also a one-sided story, but I don't need to see a pile of crash-landed PCs in someone's car park to know what's happening here. As costs fall and the pace of turnover of kit slows because people's 'good enough' flag has been raised, so those businesses that have been relying on the simpler forms of box-shifting and surfing on the back of the hype from t
he bigger players will find it ever harder to make ends meet. When someone turns a business relationship into an audition for 'One Flew Over the Cuckoo's Nest', it suggests that this business sector needs to think of a new basis for doing what it does, and quickly.
**A wise man once told me that every recession is different and, hence, little that was learned from the last one - by the hardest-hit industry - is of any use to those hit in the next one. Looks like this time around it's the turn of the computer business to learn some lessons.
Steve Cassidy turns rag-and-bone man before jazzing up his findings
April 2003
102 (April 2003)
Networks
Davey Winder
I spy
A decade ago, I was writing about two distinct 'wares' as far as the online community was concerned: shareware and WaReZ (yes, software pirates pre-date Ali G in the comic abuse of EngLish SpeLLinG). Unfortunately, the latter are as active as ever, and the increasing availability and affordability of broadband connectivity has allowed WaReZ to move off campus and firmly into both the domestic and corporate environments.
**The distribution of cracked, pirated, unlicensed software remains
as illegal as ever, but fast web connections, peer-to-peer networks and ever-changing IM group hangouts all mean it's sadly easy to find and download everything from Windows XP Pro and Office XP to those applications previously too large for downloading like Photoshop and Quark.
**As for shareware, that has changed almost beyond recognition, splintering into all kinds of niches to capture a global market that demands something, anything, that will grab the ever-diminishing attention span
of the Internet consumer.
**So it is that we find ourselves faced with a multitude of options when looking for an FTP client, for example. Beyond the obvious questions of suitability for task and feature set, the next big tick on the list is often How Much? The Internet generation tends to favour things being free, and that includes software. Much of what was shareware has become freeware, or is offered in both free and paid-for versions - the former, there being no such thing as a free
lunch, subsidised by sponsored advertising banners that can't easily be removed.
**It's ironic that in the rush to get something for free, vast numbers of Internet users are sacrificing their treasured right to privacy on the altar of advertising-sponsored freeware. Let's get this straight: not all freeware impinges upon your privacy, and not all invasions are perpetrated by banner ad-displaying software. To understand the whole picture, look closely at the definitions of three new wares
that are more prevalent in the modern online age: adware, malware and spyware.
**Warez the beef?
*It can be hard to properly distinguish these three terms, thanks to sloppy thinking among the online chattering classes (for whom fact never gets in the way of a good Usenet posting) and the IT media, which can be just as guilty of spreading half-truths and 'factions'. We need to be clear on these definitions if the commercial Internet and the understandable desires of its users are to coexist.
**Adware is, at its simplest, just the use of sponsored adverts to finance the continued development of free software. The user will often get a choice between buying a licence and getting an ad-free version, or taking the 'free' route and suffering the embedded ads that come with the package. Such adverts will usually take the form of a banner, scrolling ticker or a dedicated frame/window within the application itself. However it materialises, you can't easily disable or remove it,
and importantly the viewing of such adverts results from a conscious decision taken by yourself.
**At the opposite extreme, we find malware - malicious software - which is specifically designed to damage or disrupt your PC, typified by the virus or Trojan. True adware, despite what you might read elsewhere, is far removed from malware and doesn't deserve the bad press it gets. What not only deserves bad press, but also merits much wider media coverage than it currently gets (probably bec
ause it's a technical issue and often not understood sufficiently by those who would write about it) is the last of our 'wares' trio, spyware.
**Falling somewhere between adware and malware, spyware isn't illegal (to the best of my knowledge), though European data privacy legislation could soon dramatically change this. It's most definitely unethical and deeply disturbing, the more so because it often purports to be straightforward adware or even to be non-sponsored, no-catch shareware.
Yes, the sad news is that you can pay for the privilege of having spyware installed on your PC and deliver your personal data directly into the hands of online advertisers. Almost without exception, spyware is stealthy - why do you think it's called that?
**You install software A, but this process also installs utility B, the spyware component. Spyware components are best thought of as demographic/marketing data-collection and analysis agents, tracking your online movements, preferences
and activities and then delivering this data to some third party with whom you have entered into no explicit agreement. Of course, the reality may be that you actually did sign away your right to privacy and control because you didn't bother to read the small print of the end-user licence agreement, but most folk simply hit the 'I Agree' button during installation (perhaps there's a lesson to be learned).
**The end result is that your privacy is breached, your control over how private da
ta about you is used removed, and - most of the time - this all happens behind the scenes without you knowing such information is even being collected, let alone collated, stored and used to point ever more supposedly tailored advertising at you (just for the record, I don't need Viagra, I'm happy with the size of my breasts, and I can probably survive without a remote-controlled micro-racing car, thank you very much).
**Privates on parade
*Assuming that you do care about privacy and tha
t you accept you may have installed software containing a spyware element without your knowledge, how may both your privacy and system integrity have been affected? Yes, you read that last bit right, there is more at stake here than just control over personal data (as if that weren't enough).
**As a PC Pro reader, you most likely take the long-trousered approach to computing both in the office and at home, and a big part of being a grown-up PC user is being in control of what software yo
u install on your standalone or networked PCs. Anything that gets installed by stealth has to be bad news. How can you make any sensible decision about how it might impact upon system integrity, stability and functionality when you don't know that it's even there, let alone what it does or how it operates?
**And this doesn't even start to touch upon thorny issues such as the responsibility of an employer to abide by the conditions of an Acceptable Use Policy that may form part of a contr
act of employment, and which will probably lay out the conditions under which that employer may monitor online usage and movements. What if your employer doesn't know that a spyware agent is monitoring such activity and reporting back to some equally unknown third party, but a clued-up employee finds out?
**Although removing spyware usually results in the 'host' application ceasing to function reliably, if at all, the reverse procedure of removing the host application that has installed s
pyware components doesn't in itself guarantee a cure. More often than not, the spyware parasite will hide itself in the Registry and not let go, nor stop transmitting that marketing information. In some cases, these spyware components are so intertwined into the system Registry that all but the most precise of removal strategies can result in damage to system integrity, such as loss of Internet connectivity or network instability.
**A good place to learn more about the perils of spyware i
s <a href="http://www.tom-cat.com/spybase" target="_blank">www.tom-cat.com/spybase</a>, where you'll also find the excellent TomCat Spyware List utility. This online tool consists of a quick and easy web-based front end to a database containing details of a thousand or so spyware components and the host applications that install them. It's not foolproof, although it does make for a decent first defence against spyware infection in that you can query the database by host application name or
part thereof.
**For example, if you were looking for a shareware FTP utility, it's easy enough to search on 'FTP' and get a list of software that's flagged either as a confirmed carrier of spyware, under investigation or cleared. You can then follow the click trail deeper to reveal exactly what spyware is installed by any particular application and make your own grown-up decision about whether to proceed further or not.
**TomCat also allows you to search the database by spyware compon
ent and reveals equally detailed information as a result. Of course, if you spot an unknown process at work using Task Manager or similar, you can just as easily slap it straight into Google and more often than not reveal just as much detail, but I quite like removing a layer of unnecessary research time by going straight to a dedicated resource such as this.
**The soft option
*That brings me nicely on to the second line of defence, which assumes that you're not sitting in front of a bran
d spanking new PC, unblemished by software installation, and haven't vetted every installation via TomCat before allowing it anywhere near your machinery. Back to the real world then, and the task of playing detective and turning the tables by spying on the spyware. There are numerous tools out there that claim to do this job, but some are frankly incapable of telling their virtual ass from their virtual elbow, while others assume that every .DLL is dirty and destroy your system directory
with a hail of deletions.
**The best application - for as many years as I have been aware of this subject - has been Ad-aware by Lavasoft (<a href="http://www.lavasoft.de" target="_blank">www.lavasoft.de</a>). Having the advantage of being free to use, this product has built a user base of millions, some of whom (like myself) have coughed up the paltry $15 (
£9.50) to register the 'plus' version and help keep development going, while getting in return some additional configuration option
s and a bonus 'ad-watch' real-time monitoring module thrown in for good measure.
**You may notice that I said best in the past tense, which perhaps isn't entirely fair as I suspect Ad-aware will regain its crown once the version 6 upgrade is released. But for now, those of us who have relied upon Ad-aware 5.83 to detect and destroy spyware must think about an alternative if we are to keep our systems clean and our movements as private as they can realistically be online.
**The problem
arises out of the need to provide users with regular 'reference file' updates containing details of the latest spyware components, exploits and mutations. These are downloaded and integrated into the Ad-aware engine automatically (if so configured) and, because the system has always worked so well, most users have treated it as a fire-and-forget solution.
**This is fine, as long as those reference file updates keep coming, but as I write these words the last such update was dated 24 Sept
ember 2002. This lack of essential updates isn't due to a lack of motivation from the user base who help to supply details of new exploits and mutations of existing threats using the accepted submission procedure. Such users have noted that, despite their submissions, no updates have been forthcoming, and some well-respected and high-profile folk - including previous administrators of the support forum - are reporting that their complaints have been deleted.
**Whatever the behind-the-sce
nes politics and manoeuvrings, one fact is crystal clear: there's such a gap in the reference updates that it's irresponsible to suggest anyone rely on Ad-aware until this situation is resolved. I hope this resolution will have been reached by the time you read this.
**Posting in the support forums at <a href="http://www.lavasoftsupport.com" target="_blank">www.lavasoftsupport.com</a>, an official statement apologises for the inconvenience, but informs users that '...to effectively and s
afely remove many new targets and those older ones which have changed/mutated, we would need to do a major upgrade of 5.83 and its reference file. The 5x engine (in its current configuration) cannot handle many of the new targets correctly... the entire Ad-aware executable would need to be rewritten.'
**And that's just what's happening. Version 6 is under development and promises to be more than just an update, but rather a whole new application. In light of this, the statement continues
: 'We decided that any further upgrades beyond 5.83 would be more of a waste of time than productive as the coding work for version 6 was already under way.' A release schedule for registered Ad-aware Plus users was set for mid-January, with a free version coming in February. Ann-Christine Akerlund, one of the owners of Lavasoft, added an assurance that, in spite of rumours, the version 6 upgrade will be free for paid-up users.
**This is all good news, assuming those schedules are met and
no more of the delays that have disrupted the development process intervene. Whatever the case, Lavasoft will have to pull out all the stops with its Ad-aware revamp and make the new product as exciting and functional as the one that appears to have stolen its crown in the meantime, Spybot S&D. Initially popular because, like Ad-aware, it comes as a freeware utility, Spybot (<a href="http://spybot.eon.net.au" target="_blank">http://spybot.eon.net.au</a>) has maintained the growth o
f its user base largely through word of mouth of satisfied users. There's no 'Pro' or 'Plus' version of the software, although if you're feeling generous you can make a donation to help development continue.
**The S&D stands - rather annoyingly - for Search and Destroy, but get past this frippery and you discover an application that does what it says on the box and then some. Like all such software, it demands attention from the user: when you're mucking around with the Registry and remo
ving .DLLs at the very least you need to read the FAQ. Cruising through on auto-pilot and just clicking all the 'Yes, do it' buttons could end with you doing more harm than good. However, Spybot does come complete with a 'recovery' option that rebuilds what it has knocked down, step-by-step, so you can even spot where the damage was done.
**Another neat feature is the colour coding of alerts. If something is flagged in red, Spybot is pretty sure it's a spyware problem and these are pre-s
elected for removal (although I still maintain there's a need to check through manually to make sure before doing the deed). Black entries are system internals best left alone unless you really know what you're doing (if you don't understand what these are, leave them alone). Finally, the entries marked in green are application usage tracks, which you can safely remove on an application-by-application 'need to know' basis.
*What separates Spybot from the crowd - and that includes most o
f the commercial products flooding the market - is the depth of detail it gives when you hit the Description button. This pops up a 'more information' box, which highlights the company concerned, the product and nature of the threat. You get URLs taking you to the relevant developers' sites, including privacy policy pages where appropriate, a pr
écis of that policy if available, and a full and complete description of what the component does and why. It is both impressive and educational, an
d I've certainly learned a thing or two since using it. Make sure you use the update function though, as this updates both the core Spybot engine and the necessary reference files to keep on top of new exploits.
**Throw in its excellent logging options, flexible configuration and even value-added bonuses such as a file shredder for more permanent deletions and Lavasoft might have a job on its hands clawing back those users who've jumped the Ad-aware ship in favour of the Spybot boat whil
e the development seas have been heavy going. Me, I'll use both. In fact, if you want to take a grown-up attitude to online security, privacy and system integrity issues, you'll adopt a multipronged strategy that includes anti-virus protection and firewalling as well as spyware scanning and the like.
Davey Winder opts for stealth mode as he delves into the murky world of spyware
April 2003
102 (April 2003)
Online
Tom Arah
Filter tips
Two issues ago, I looked at the huge power and creativity that the use of bitmap filters can open up, from the basic blurring and sharpening operations through to advanced artistic effects. This month, I'm going to explore how Photoshop users can bring out the full potential of the dozens of filters available to them.
**First, you have to make the most of the power offered within each filter, which is easier said than done. The sheer range of options means no-one can intimately know the
effect of every parameter. Most parameters are controlled via gauges, but because of the often unpredictable interactions between different settings, there's no obvious connection between increasing or decreasing a particular parameter and the effect it produces.
**The best way to get to grips with each filter is to experiment, and a useful tip for Photoshop and Elements users is to hold down the Alt key while dragging a parameter's gauge, as this updates the small Preview dialog in rea
l-time. Pressing Alt also turns the Cancel command into a Reset command that restores all parameters to the last applied settings or in-built defaults. If the filter allows, it's desirable to preview the effect on your image as a whole. For screen work, it's important to set your view to 100 per cent to see how your actual pixels are affected, while for print work it's also useful to see how the effect will work to scale by switching to Print view (Photoshop lets you swap views even while
the Filter dialog is open).
**Most filters are capable of producing vastly different effects, each of which might be desirable in different circumstances. The more advanced programs recognise this by providing filter presets, which are simply stored parameter settings. Even more useful would be the ability to save and load your own commonly used settings, and in Photoshop you can achieve the same effect by recording filter settings as part of an action from the Action palette.
**You cou
ld, for instance, set up separate Unsharp Mask actions, with different diameter and threshold settings for images destined for screen and print, or set up Chalk & Charcoal artistic effects with different colours and weightings.
**It's crucial that you get the effect exactly as you want it in advance, because all Photoshop's filters are 'destructive', which means that once applied the original input pixel values are lost. In other words, unlike changes made via adjustment layers or layer s
tyles, you can't recall the dialog at some later stage and retrospectively fine-tune the effect. Hence, if the effect on the image as a whole isn't exactly what you expected, hit <Ctrl-Z> to undo the filter, then <Alt-Ctrl-F> to call up the dialog again for another try.
**Both Photoshop and Elements offer slightly more flexibility via the History palette, which lets you undo back to an earlier image state. In Photoshop, you also have the option of storing an image state using the palette
's Snapshot command. This is particularly useful with filters, making it possible to store variations of an effect and then view them quickly in turn to select your favourite. You can even create an action that automatically applies different filter settings and stores the results as selectable snapshots.
**Using these tips, you should be able to get nearer the results you want, but usually you'll still want to adjust the end effect. Sometimes a filter effect may appear too weak and your
best option is to re-apply it (<Ctrl-F>), say, to boost sharpening. More commonly, the effect will be too strong and you'll want to tone it down. In Photoshop, the Edit | Fade command (<Shift-Ctrl-F>) merges the before and after image states. Essentially, the Fade command is a filter in its own right, taking each pixel's before and after values as its inputs - set Fade to 50 per cent and the pixel value is calculated by splitting the difference between the original and filtered values.
*While the most common use of the Fade command is to tone down the filtered effect, it can do much more, thanks to the use of blend modes. By setting the blend mode to Darken, the filtered pixel value will only be used if it's darker than the original. Set the blend mode to Difference and the final colour becomes the difference between original and filtered values, producing unpredictable and eye-catching results.
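**To make the arithmetic concrete, here's a rough sketch of the per-channel sums involved, written as three small VBScript-style functions. This is my own illustration of the principle only, not Adobe's code or Photoshop's scripting interface.
' Illustrative per-channel arithmetic only - not how Photoshop implements it.
Function FadePixel(original, filtered, amount)    ' amount is the Fade percentage, 0 to 100
    FadePixel = original + (filtered - original) * amount / 100
End Function

Function DarkenBlend(original, filtered)          ' use the filtered value only if it's darker
    If filtered < original Then
        DarkenBlend = filtered
    Else
        DarkenBlend = original
    End If
End Function

Function DifferenceBlend(original, filtered)      ' final colour is the difference of the two
    DifferenceBlend = Abs(original - filtered)
End Function

' For example, FadePixel(200, 100, 50) returns 150 - splitting the difference
' between the original and filtered values, exactly as described above.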
**This mixing of original and filtered image states is great for producing
both subtle and striking effects, and in many ways the Fade command should be seen as the most useful of all filters. As with every other filter, its effects are destructive - you can't retrospectively fine-tune your filter's strength. However, you are able to get around this and even provide similar capabilities within editors that don't offer a Fade command by the intelligent use of layers.
**Select your original image, copy it and paste it as a new layer, then apply your filter effect
to the new layer. You can fade the effect by changing the new layer's opacity, or create blend-based effects by selecting from the full range of layer blend modes. Crucially, this approach allows you to change opacity and blend mode at any time, and because you've kept a copy of the original image you can always delete the filtered layer and start again. While not as flexible as a non-destructive layer style or adjustment layer, this trick is a definite step forward.
**Behind the mask
*Applying filters globally to the entire image is easy, but you may want to sharpen only particular features, to motion-blur a stationary object or to apply a canvas texture to just the white areas. In fact, controlling where filters are applied is also fairly simple - just make a selection first using any of Photoshop/Elements' selection tools or commands.
**If you're using Photoshop rather than Elements, you can apply filters to selections or even individual channels by first selecting o
ne from the Channels palette. This can create subtle effects - for example, applying a sharpen filter to just the colour channel with the highest contrast - or striking effects, such as applying a coloured pencil effect just to an image's blue channel.
**You may also apply filters based on Photoshop's user-defined alpha channels or masks. This is particularly powerful as you're able to control the effect just as with the Fade command, but intelligently varying its strength across the imag
e. By creating and selecting, for instance, a radial gradient alpha channel before applying a Gaussian Blur, it's possible to produce depth-of-field effects that draw attention to the focus of your image. Do the same and apply any Brush Strokes filter and you can create the effect of a photograph seamlessly turning into a painting.
**Using selections and masks with your filters offers great creative control, but the need to know exactly what effect you want to produce and set it up in ad
vance is a serious hindrance to creativity. An alternative approach in Photoshop is to use the History palette to take snapshots of the original and filtered images and then to select the original snapshot.
**Using the History Brush (shortcut Y), target the filtered snapshot in the History palette by clicking its source box and then selectively paint the filtered colour values onto your original image. You can even change the History brush's opacity and blend mode to take further control
, and target the original snapshot to undo any changes. Effectively, this means it's possible to interactively paint onto your image with any variation of more than a hundred filters.
**In many ways, the History brush is Photoshop's most powerful and adaptable tool, but it still has one drawback: when you close the image, all snapshots are lost and any changes are made permanent. Alternatively, you can maintain maximum non-destructive flexibility by again turning to layers. Store your bef
ore-and-after image states as separate layers and then add a layer mask to the topmost filtered layer using Photoshop's Layer | Add Layer Mask | Reveal All command. When the layer mask is selected as the current target, painting with a solid black brush reveals the underlying original image layer, while painting with solid white re-applies the filter effect to any original areas that you've revealed. Painting with grey creates a semi-transparent mask that merges the before and after image
states. By adding a gradient to the layer mask, you can even manage global fade effects, so - because you can always change the gradient - the mixing effect remains live and editable.
**Through the intelligent use of layers, you've now taken total and fully editable control of the mixing of an original image with any given filter, but don't get carried away. We might have managed to make the Fade command non-destructive, but there's still a major limitation that becomes apparent if you wa
nt to apply a second filter. Filters can't be applied like an adjustment layer to all underlying layers, and so your only option is to flatten the layers, making all changes permanent and immediately losing the flexibility and editability you've laboriously built up.
**One of the biggest strengths of filters comes through their combination. Before applying a Find Edges filter, for example, it's often useful to apply a Colour Reduction filter. And the best way to produce a brushed metal fi
ll is to apply some Noise followed by a Motion Blur and so on. If you thought filters on their own provide enormous creativity, combining them means the variations become virtually infinite.
**It's worth thinking about this in more detail. Each filter works by taking original input pixels and applying a mathematical operation to generate a set of new values. In more advanced filters, such as some of the artistic effects, these new values are then used as the input values for further oper
ations, so that the final filter effect is the result of a stack of serial operations. And, crucially, there's no reason why this stack shouldn't straddle multiple separate filters, in which the separate filters themselves act like the parameters of a larger overall effect.
**But there's a problem when it comes to controlling the final result of these combined filters. It's difficult enough to predict and control the interaction of parameters within a single dialog, so how can you experi
ment to get the best result from the combination of various settings when the parameters are split between different filters? With adjustment layers and layer styles, this isn't a problem. If you're combining Levels and Hue/ Saturation adjustments for example, you can fine-tune each in turn, while Photoshop's Layer Style commands allow you to independently manage multiple interacting style effects from a single central dialog.
**What's needed is some way to keep the stack of filter operat
ions live so you're able to experiment with different settings to produce the best possible results. This is exactly what the UK company Fo2PiX promises from its artistic buZZ.Pro 2 plug-in filter (<a href="http://www.fo2pix.co.uk" target="_blank">www.fo2pix.co.uk</a>). In fact, buZZ.Pro is a collection of 19 filters for applying colour reduction, edge detection, blurs and so on. However, what makes it different is that you don't apply these effects individually, but instead combine them i
n an ongoing stack. To produce its Oil effect, for example, the stack is built up from colour edge detection, emboss and two simplifier effects. At any time, you can reorder this stack, add or delete steps or fine-tune each filter, say, to cut down on the number of colour areas in the image. Each change will produce completely different end results.
**It's possible to save stacks that work well for future use, and buZZ.Pro comes with a selection of 13 presets ranging from Bright Colour Wa
sh through to Watercolour. It works as a proof of principle - demonstrating the huge creative advantages that flow from combining multiple live effects - but has a number of intrinsic and unavoidable limitations. It can be difficult to judge an effect from the tiny preview offered and, while its range of filters is reasonable, there's no way Fo2PiX can ever hope to match the full breadth and depth of Photoshop's existing filters. But, most damaging of all, while each individual filter rema
ins live within a buZZ.Pro stack, the overall effect is still destructive and, once applied, there's no going back to fine-tune it.
**Clearly, a more general solution is needed to answer all of these criticisms, and the answer is staring us in the face. Why not apply the existing Photoshop filters in a non-destructive way, either as adjustment layers that affect all underlying layers, or as layer styles that would affect just the layer to which they're applied? After all, this sort of bit
map-processing power is available in another Adobe graphics application - the vector-based Illustrator.
**One of Illustrator's biggest strengths is that it provides the majority of Photoshop's filter collection on its Filters menu, ready to be applied to any imported bitmap. More powerful still, these same filters can also be applied from its Effects menu. At first, this duplication might seem strange, but applying a filter as an effect has a number of advantages, most obviously that you
're no longer restricted to applying the filter to bitmaps and are able to apply it just as well to vector objects that are automatically rasterised and masked.
**Illustrator's rasterisation isn't destructive as it is within Photoshop, where you have to permanently render a shape layer down to pixels before applying a filter. Instead, the effect is live and works like a layer style so that it's still possible to edit your objects by resizing, repositioning, reshaping, reformatting and s
o on, with the filter automatically re-applied. In short, you have the best of both worlds: the precision and editability of vectors and the creativity and naturalism of pixels.
**Also, because Illustrator's effects are non-destructive, they are retrospectively editable. Select an object that has had a filter effect applied to it and open the Appearance palette (<Shift-F6>), which lists any applied effects along with other formatting options. Double-click on the filter name and its di
alog opens for you to fine-tune the settings. Even better, there's nothing to stop you applying multiple filter effects, each of which is listed in the Appearance palette. Re-order the filters or change settings and the overall effect is changed accordingly. It's possible to create buZZ.Pro-style stacks using any combination of any filter and edit it at any time.
**This is exciting creative power, but the system still isn't perfect. For one thing, Illustrator only allows filters to be app
lied directly to objects, with no option to apply effects to all underlying objects or sections of objects as there is with the lenses in Deneba Canvas - in other words, you can apply filters like layer styles, but not like adjustment layers. It's also important to realise that there's a serious performance hit involved in keeping filters live, especially if you target your effects to produce 300dpi print work rather than 72dpi screen work.
**Hopefully, Adobe is listening and will add op
tions to temporarily switch off or lock effects in the next release of Illustrator. More to the point, I hope the company is working on adding a similar non-destructive approach to managing filters where it makes most sense - in Photoshop. In the meantime, the best way to make the most of Photoshop's bitmap filters is to get to grips with the Actions, History and Layers palettes - and Illustrator.
Tom Arah shows you how to make the absolute most of your Photoshop filters
When you build web applications or dynamic websites, you soon realise, as with any programming, how important thorough testing is. You can do this manually, clicking through the pages and fixing snags as they arise, but with the best will in the world one soon gets lazy and it becomes all too easy to forget to test a particular scenario.
**For example, your site may work well for new visitors, but does it still work the way it should if they've previously registered, have a cookie placed
on their machine and visit again? Perhaps your test machine contains cookies from previous tests: did you remember to clear them out each time?
**If a bug is found, the tests should be run again from the start, but how many of us just assume (hope?) that the fix won't have knock-on effects and continue where we left off? The time pressure of web development is such that testing time can get squeezed, so anything that helps is welcome. There are many testing products out there, but they'r
e often price-targeted at large development houses and many need their own programmers to sort out the scripts.
**It was with this in mind that there was some interest when a copy of QA Wizard (<a href="http://www.seapine.com/qawizard.html" target="_blank">www.seapine.com/qawizard.html</a>) was sent to Mark. This product enables you to record your movements through a website and write them into a script, which you can then edit and customise, for example, to change the variables typed int
o a web form page. Instead of being fixed in the script, these variables can easily be fed from a database, further extending the product's flexibility. QA Wizard scores over many of its competitors because script editing is done via an easy-to-follow user interface rather than at code level. Once your script is recorded and customised, you may run it at any time to test the site.
**Scripts can take a while to run so it's a good idea to build them up to test a specific area of the site a
nd then have a master script call the smaller ones. This is easier to debug and the smaller scripts may be used on their own during development, then the master script run overnight for a thorough test. Once the required number of iterations of a script has run, an HTML page is generated showing the results and any pages that failed. From this page, you can log any errors into another product called Log Server, which assigns bugs to a particular person to fix and follows their progress.
**It still takes time to initially set up QA Wizard so that a thorough testing regime is generated, but the ease with which tests can be run makes it well worth the effort.
**Programs like this are handy when you have to revisit a website because a client wants functionality added. You can then easily retest the site using the previous scripts to make sure your modifications haven't broken anything. At
£2,000 plus VAT, QA Wizard is hardly cheap, but it's less than half the price of its c
ompetitors. You could still argue that this is a lot of money for what, on first inspection, appears to be not much more than a macro recorder. However, QA Wizard is much more than a simple macro recorder thanks to the flexibility it offers when customising the built script. The program's cost would be recovered by the time saved doing full tests, and if you can't afford a program like this perhaps you should be asking yourself how long you spend on testing and what that's costing you. If
the sums still don't add up, maybe you should be testing more.
**The final blog
*This month, I reach the end, for now at least, of my series on creating your own Weblog, or blog. Those of you who've enjoyed it so far (thanks for the many emails) will be pleased that you can start putting it all together. The two people who hated it (again, thanks for the emails!) can look forward to 'normal service resumed' next month. Last month in issue 101, we got some data into the blog database, and
this month we'll fetch it back out again as we're going to create the front-end user interface.
**Let's start by thinking about how to display it. A typical blog might have, on average, one entry per day - sometimes more, other times less. You'll probably average around 30 entries per month, and that's a sensible number to view at one time, so we can build an 'archive' view that presents the blog entries on a month-by-month basis - click on April 2003, for example, and you'll see all of t
he diatribes for that month. This isn't suitable, however, as the main front-end view that people see when they first hit the blog: they could arrive on the 1st of a month and find it empty or the 31st and find it full. A better option for the front page is to display the latest X entries, where my preferred value for X would be around ten, but you'll probably have your own preference.
**As before, I'm only going to show you some skeleton code, which you'll have to piece together and add
features yourself, as well as getting out your coloured pencils to make the whole thing look pretty. Let's start by pulling the latest ten entries out of the database and displaying them in a table.
**As always, first make a connection to the database:
strconn="PROVIDER=Microsoft.Jet.OLEDB.4.0;DATA SOURCE="
strconn=strconn & "c:\webdata\xyz\blog.mdb" & ";"
set connBlog=server.createobject("adodb.connection")
connBlog.open strconn
*You'll need to change that
path in the second line to point to where you have your database stored. As with the other examples to date, we'll be 'talking' to the database using SQL statements. You probably think that pulling data out of the database will be simple, but in a moment I'll be showing you some of the hairiest and ugliest SQL statements that I've used so far in this series. So, to grab the ten latest entries, you might try something like this:
*</b>strSql = "select top 10 * from tblBlogs order by dtDateTi
me desc"
*This takes the blog table, sorts it using the date and time field, and returns the data in descending order (most recent entries first). The 'top 10' part then simply keeps the first ten entries and discards the rest. You can easily pull that hard-coded ten out and replace it with a variable parameter. To actually use that SQL to extract the data, wrap it up in code as follows:
*set rs=connBlog.execute(strSql)
*This returns a recordset containing the ten records (or between zero and ten records, as it obviously can't contain more data than exists in the database). To turn the data from this recordset into an HTML table, you need code that looks something like this:
*loop
response.write "</table>"
*Then finally, because we always like to be kind to our servers, we clean up after ourselves:
set rs = nothing
connblog.close
set connblog = nothing
*In theory, you shouldn't have to do that last step, but I've seen web servers fall over regularly until the code was revised to close anythi
ng left open and discard any objects that had been created.
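**To flesh that fragment out, a minimal sketch of the kind of display loop being described might look like the following - note that txtEntry is an assumed name for the blog text field, so substitute whatever your table actually uses:
response.write "<table>"
do while not rs.eof
    'one row per entry: date/time, poster, category, then the entry text itself
    response.write "<tr><td>" & rs("dtDateTime") & "</td><td>" & rs("ptrPoster") & "</td>"
    response.write "<td>" & rs("ptrCategory") & "</td><td>" & rs("txtEntry") & "</td></tr>"
    rs.movenext
loop
response.write "</table>"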
**There's just one problem with the code just shown; namely, that the Poster and Category values will be shown as numbers rather than the real data, because all that you've pulled from the database are the pointers into the poster and category tables. You should remember that we stored the actual names and category details in separate tables to save having to store multiple copies of repeating data in the main blog table.
**The
re are two ways to get the correct data from the database. You could write a function that, given a poster value of x, did a lookup into the poster table and returned the correct name, but the problem with this is that it would impose two extra database lookups for each blog entry. For this sample app, that probably doesn't matter, but it would slow things down in a huge e-commerce engine. A better option is to make the database work for you by modifying the original SQL statement to read
as follows (note that this is all one long line):
**strSql = "SELECT tblBlogs.*, tblCategories.chrCategory, tblPosters.chrPoster FROM tblPosters INNER JOIN (tblCategories INNER JOIN tblBlogs ON tblCategories.idCategory = tblBlogs.ptrCategory) ON tblPosters.idPoster = tblBlogs.ptrPoster"
**I told you that you'd be seeing some hairy SQL this month, but if you walk through this statement you should be able to work out what's going on. It returns all of the fields from tblBlogs, just like be
fore, plus the name fields from the other two tables. Then it creates 'joins' between these two extra tables and the main blog table, where the 'on' clauses show how to glue the tables together.
**Of course, if you get stuck with your SQL syntax, you can always consult the world's best help file - www.google.com
**One thing you need to be aware of - seeing as we're using an Access/Jet database - is that Access can be fussy about multiple joins. Things you might get away with in a g
rown-up database like SQL Server can generate several kinds of (often incorrect) error messages when used against Access. However, there's an easy way to cheat. Simply fire up a copy of Access and create your query using the Design view. Then select SQL view and you should see a nice Access-compatible SQL statement that you can copy and paste into your code.
**Using this statement, we now have the name fields available, so you can change the display code to use the new data:
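*A sketch of how that amended loop might look (chrPoster and chrCategory now come straight from the joined tables; txtEntry is still an assumed name for the blog text field):
response.write "<table>"
do while not rs.eof
    response.write "<tr><td>" & rs("dtDateTime") & "</td><td>" & rs("chrPoster") & "</td>"
    response.write "<td>" & rs("chrCategory") & "</td><td>" & rs("txtEntry") & "</td></tr>"
    rs.movenext
loop
response.write "</table>"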
*That's it for the default Blog view. The other thing you'll need is the Archive view - the mechanism to view everything beyond these top ten entries. I said earlier that a Month view might be nice, and the visitor-friendly way to present its navigation would be as a list of the available months with the number of entries for each; for example, a series of links like:
*Nov '02 [20] | Dec '02 [25] | Jan '03 [10] | Feb '03 [27] | Apr '03 [5]
*How do you generate this list containing only the months that have entries and their count? This SQL does the trick (note this is all one line):
**strsql = "SELECT month(dtdatetime) as m, year(dt
datetime) as y, COUNT(*) as c FROM tblblogs GROUP BY month(dtdatetime),year(dtdatetime) ORDER BY year(dtdatetime), month(dtdatetime)"
**There are a few things to note about this SQL statement. First, it's very Access-specific, as I'm using the embedded VB functions to extract the month and year from the dtDateTime field. If you upsize this to a big boy's database, you'll need to substitute alternative functions. Second, note that I've named the select fields using 'as', because they're al
l returned values from functions. Without naming them like this it would be hard to locate the correct data in the returned recordset. Third, I've introduced a GROUP BY clause that tells the database to bundle up everything for each combination of month and year. Fourth, there's the ORDER BY. Note that this time I'm using year first, then month, as otherwise you'd get Jan 2002, Jan 2003, Feb 2002, Feb 2003 and so on. This only works because the month() function returns the month number (1
to 12) rather than its name, which allows you to sort on it - a function that returned the month name when sorted would give you April, August, December.
**You can use the data (m, y, and c) returned to build your navigation links, and then link these through to pages displaying the entries for the individual months. By now, you should be able to work out the SQL for doing this yourself. And that's it - you now have all of the code you need to build a very basic blog system. Please let me
know how you get on. In a few months' time, I'll show you a few enhancements, such as how to add entries by emailing to your site, and how to upload photos and other images.
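*In case you want to check your workings, one way the per-month query might come out (a sketch only, which assumes the month and year arrive in the archive link's querystring as m and y):
m = Request.QueryString("m")
y = Request.QueryString("y")
strSql = "SELECT * FROM tblBlogs WHERE month(dtDateTime)=" & m & " AND year(dtDateTime)=" & y & " ORDER BY dtDateTime DESC"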
**Online for Christmas
*The post-Christmas and January sales period is as good a time as any for Mark to reflect on the performance of online shopping versus its more conventional competitor, the high street. The media were full of doom and gloom about poor retail performance in the high street, with 6 per cent or less growth be
ing bandied about and everything from bad weather to the events of 11 September being blamed. However, it seems that online traders were having a much better time of it, with growth rates of up to 125 per cent being reported. After all the hype of the dotcom boom and bust, those online traders who are still here seem to be meeting some success, particularly the specialist retailers.
**These sites aren't on the scale of Amazon or Tesco, and often use off-the-shelf server products like Act
inic to build their e-commerce websites. They usually supply products that you'll not find easily in the high street, and it seems that more and more people are turning to the Internet to buy such items. Actinic, a UK developer of one of these off-the-shelf e-commerce packages, states in a press release that its customers are reporting an average increase of 66 per cent in sales, selling stuff as diverse as snow sports equipment (<a href="http://www.snowlines.co.uk" target="_blank">www.snowlines.co.uk</a>).
**According to The Independent, 'British shoppers spent
£1bn online in just one month in November... The
£1bn spent in November is a 95 per cent increase on the same month
in 2001... The rate of increase for Internet retailing is 15 times as fast as high-street shopping... High-street shops are putting a brave face on a November growth figure of just 2 per cent, the lowest for two years.' This is good news for all of us involved in the business. Online experiences are improving all the time with such excellent sites as Virgin Wines (<a href="http://www.virginwines.co.uk" target="_blank">www.virginwines.co.uk</a>), while many of the high-street retailers see
m to be falling back on 'pile 'em high, sell 'em cheap'. Take a look at the record and DVD stores, for example, which instead of offering more services or advice seem to be concentrating on competing with the online stores with large stocks.
**The bookshops have fought back by adding coffee shops and reading areas, and suggested reading labels. There's no way a high-street store's product range can compete with a global warehouse offering online sales through a comprehensive search engin
e. People will now often pay slightly more for the convenience of online shopping, but go into a high-street music store to try and find something different to listen to, or get help.
**Visit a website like Virgin Wines and order some wines, then the next time you log in you can rate the wines you've bought and mark specific ones so you don't forget them. This builds up into a list of your favourite wines, and the system can recommend others you may like, while the company is getting pri
celess feedback on its products. Mark can't think of any major retail chain that takes this much interest in its customers.
**Playing to the strengths of each medium is what will result in success, but the game of 'catch-me-up' rarely will. Therefore, it's good to see the online shops doing this and all those earlier attempts to mimic the high-street store with virtual 3D stores being consigned to the bin... for the time being.
Perfectionist Mark Newton insists on thoroughness, while Paul Ockenden finally blogs off
It's probably snowing now, the frost is thick and deep, and you might be thinking that hibernation is a good idea. Unfortunately, I'm typing this from the sun-lounger next to the roof-top pool at a hotel in Rio de Janeiro, armed with an indecently large Bloody Mary and extra-dark sunglasses. All these distractions make it hard to summon up the will power to come up with a coherent strategy for Microsoft's product platform as it moves forward. But a number of things have come to light over
the last month or so that demand discussion and consideration, however 'difficult' my surroundings.
**Microsoft has, in effect, laid out its roadmap almost until the end of the decade. There hasn't been a grand unveiling of a Huge Corporate Plan as yet, but bits and pieces are definitely coming out into the open. What's surprising is that Microsoft genuinely appears to have started growing up.
**The message has finally got through that you can't keep linking arbitrary platform changes t
o middleware changes, and thus to desktop changes. In other words, the fact that Office 11 is around the corner is no excuse for tying it into XP Professional or Home on the desktop. In truth, Microsoft hasn't been as guilty of doing this in the past as some would have you believe, but there have been some howling errors on the server side.
**Remember how you had to install IE 4 to get the NT 4 Option Pack - and thus a better release of IIS - onto your core servers? This was because Micr
osoft was unwilling at the time to supply a single security DLL away from the IE 4 installation tree. Then there was the perception that you had to get Windows 2000 Active Directory up and running before you could go to Exchange 2000, which was entirely true. Not only that, it was quite right and proper, due to the huge architectural changes that exist between NT 4 and Windows 2000 Active Directory. It wasn't realistic to expect Exchange 2000 to run on NT 4, however much some extremist cus
tomers might have wanted it to work.
**Microsoft was, however, hugely guilty of overplaying the Active Directory card to those companies that were, at the time, happily sat on NT 4 and the old LanMan-style domain model. By tying Windows 2000 to Active Directory in all of its marketing blurb, Microsoft effectively killed any possibility of people using Windows 2000 boxes as member servers in a happily running NT 4 domain. This high-handedness has come home to roost for Microsoft, though,
as there's still a lot of NT 4 out there and that isn't going to change in the next few months.
**Many customers with the Windows 2000 Server platform, myself included, have been waiting to see what tie-ins Microsoft would impose for the .NET 2003 Server release. Will it be that while you can run the 2000-era server products on Windows 2000 Server, anything with a 2003 or later badge will require .NET Server 2003? This proves not to be the case. In fact, it's so dramatically not the case
that I kept haranguing the relevant Microsoft people with the battle cry of 'where's the but?'
**To my astonishment, there doesn't seem to be one this time. Every significant forthcoming Microsoft server-side or middleware product or platform will run on Windows 2000 Server. You don't need to upgrade your underlying servers, but there may be advantages that become available when you do. For example, there's no doubt that Titanium, or Exchange Server 2003, will run on Windows 2000 Server
with the Windows 2000 Active Directory, but there are new things in .NET Server 2003 Active Directory of which Titanium will be able to make good use. If you don't want the .NET Active Directory improvements but lust after the gorgeous new Outlook Web Access that comes with Titanium, that's no problem - upgrade your Exchange 2000 boxes to Titanium and leave your Windows 2000 Active Directory infrastructure as it is.
**In a move that made me blink, it seems that all of the XML-related mi
ddleware servers have been scrapped. BizTalk has gone, as have Content Management Server, Commerce Server and so on. By 'gone' I mean there are no longer any product groups working on each of these shrinkwraps. Each will be replaced by a component in the forthcoming Jupiter platform from Microsoft. If you have existing licences, these will be converted to Jupiter licences when that platform ships, but some of these Jupiter pieces are going to be on a 2005 or even 2007 release schedule, and
all of them will, according to Dave Wascha, chief guru of that group, run on Windows 2000 Server.
**This might seem scary, but the benefits will be huge. At present, Microsoft has four incompatible workflow engines - one in BizTalk, one in Commerce Server, one in Content Server and the little baby one in Exchange 2000. By forcing all of the Microsoft product groups to use just one, it will have to be best of breed, and - assuming there are no nasty hidden API lockouts - the customer base
will benefit too. How many of you have used the workflow designer in Exchange 2000, for example? Very few, I wager. But that's not the end of the boldness - Dave Wascha has personally guaranteed that all the interfaces necessary to host these engines will be left open. The aim is that you'll be able to use the Microsoft workflow engine inside a competitor's document management system, or use theirs inside the Microsoft offering. That's a bold claim to make.
**Given that platform/ middle
ware tie-ins appear to be a thing of the past then, and that most of Microsoft's customers are now on rolling two-year contracts, there's a point of view which holds that Microsoft should start to issue annually badged releases rather than embroiling us in large service packs that have little meaning to most people. Can you remember whether Exchange Server 2000 SP 2 is bigger, or better than SP 1? I can't - it is no longer easy to keep on top of this stuff.
**Herding Longhorns
*Now for c
odename Longhorn and where that fits into the picture. Basically, it's the next big release of the desktop OS and it will ship in mid/late 2004. There are early alpha releases of it on various warez groups, but all these seem to show is various aspects of the modified look and feel. This is the first real work that's been done on the Desktop's look and feel since the release of Windows 95, unless you count the 'cartoon' aspects of XP Home. Although XP offers a somewhat garish user experien
ce, it can be toned down if you like and you can't ignore the power of the new Wizards for most beginner and mid-level users of the platform. In Longhorn, we'll see more of this and a long-overdue push towards a properly implemented object Desktop.
**However, to make this rock and roll, we need the WinFS, which is the new Windows Filing System. This is the OFS (object filing system) that's been promised for so long and never been delivered: I'm sure some of the thinking goes back to the
Cairo OFS of ten years ago. The idea is a grand unified store into which everything you do gets poured. This isn't SQL Server per se, but it can handle SQL data. Neither is it extensible JET like Outlook PST stores and Exchange Server, but it can handle their data too. Nor is it NTFS, but it handles everything NTFS can do. Well, that's the claim. The client Longhorn product will ship with the WinFS built in, says Microsoft.
**However, there are already clouds on the server-side horizon.
Obviously, there's little point shipping an OFS on the client side unless there's a matching server-side OFS to go with it. I put that question to Microsoft spokespersons recently, and they said there would be some sort of monster patch file, service pack or maybe even an Option Pack for the server-side support of Longhorn. This has been caused by timing: if .NET Server 2003 isn't going to ship until mid-2003, at least in SP 1 form, there's no way Microsoft is going to persuade corporates
to move to a new server-side platform just 12 months later. And nor should they move - that's the job for the Blackcomb server release due in late 2005 or even 2006.
**Microsoft can afford to release the client-side version of WinFS first. It can claim that you'd have to be mad to use a new filing system unless you needed to, and that any data stored on a desktop machine would be, de facto, backed up to central servers too. If Microsoft does the job properly, end users won't notice the Wi
nFS - it will just work like it has until now, only better. By going for the client side first, it can make great use of the almost unlimited untapped power available on the desktop.
**If the new file system is 20 per cent slower than NTFS, will you notice? I bet you won't on your laptop, but you most certainly will on a corporate server hosting hundreds or thousands of users. Similarly, with disk space, if all the new content management functionality consumes double-digit percentages of
your free disk space, will you notice on a laptop that currently has 60GB free? Of course not, but you will if you fire this up on your SAN-consolidated storage farm.
**For Blackcomb, Microsoft has a clearly defined set of targets. It has to provide the management tools and infrastructure to make WinFS utterly compelling. It has to be so good that no sysadmin would want to use the current hotch-potch of storage schemes. You might believe that corporate IT managers will be reluctant to mo
ve and, yes, they will be, but there is a huge win.
**At present, IT managers have to worry about four disaster-recovery (DR) systems: file system; RDBMS data like SQL Server and Oracle; semi-structured streamed data like Exchange Server; and directory data in Active Directory. This means four tested, considered, planned and implemented solutions, each one a major headache. Reduce the count by one and these people will start twitching. Drop it to two and they'll sign on the dotted line.
Go for the Nirvana of a single storage format and they'll rip your arm off as they attempt to install the software. But all of this will require management tools and reporting facilities and proper grown-up ways of 'bending the data cloud' to fit each user without ending up with a runaway collection of links and hyperlinks and hard links and tangled webs within webs.
**So there we have it - a fairly rapid set of changes, almost all of which will build upon the Windows 2000 Active Directo
ry server platform. There is a huge amount of consolidation that will bring important technologies like workflow, commerce pipelines, content management, and object storage right to the front of the stage. I can't say it will be an easy ride, but a new maturity of thinking is emanating from Microsoft. Finally, the 'New Old Guard' of great thinkers, developers and business users who've joined Microsoft over the last ten years are making their presence felt and the era of the Microsoft Child
who joined at 16 and lacks any social skills worth noting is all but over.
**Orange SPV
*I can't let the launch of the Orange SPV phone go past without passing comment. Contributing editor Paul Lynch and the Reviews team have had lots to say, but this is an important platform from a Windows perspective so indulge me for a few paragraphs.
**First, I think the product sucks. It's lacking in testing both of the beta and usability varieties. I don't care how many guinea pigs were used at Re
dmond to do the usability testing on the Smartphone 2002 platform - you can't spot any positive results.
**At first glance, it looks promising with built-in PDA functions, phone capability and a big, readable screen. Then the problems arise: basic phone functionality is unstable in function and use; you can't easily tell if a phone call has terminated or not; the core OS isn't fully tested and can suffer from lockups and freezes; and while the PDA parts work quite well, the basic phone f
unctions are almost laughably naive in their execution.
**Worse still, there are major holes in the way it handles stored data. You'd think it's a common enough scenario for someone to sync their email from a desktop into the phone via the cradle and then to 'top up' this email using IMAP or POP3 when away from the office. But you can't do that - as soon as you try to use a second type of mail connectivity, it insists you can only have one and throws away all the existing mails. I know i
t will work much better when you have Titanium's built-in mobility features available to you, but that isn't going to be an option for most people.
**As if an unstable, untested platform wasn't bad enough, Orange has decided to require developers to digitally sign their applications in a feeble and brain-dead attempt to milk the development community of revenue for the privilege of being able to install their own applications onto their own phone. Such an own goal makes me gasp at its stupidity.
**Surely, Microsoft wants and needs all the .NET development community to start writing applications that use the lightweight framework designed for these devices? Who will go down that road if they're going to be nickel-and-dimed by Orange in the process? The lack of common sense, within both Microsoft and Orange, makes me shudder with disbelief.
Jon Honeyball looks at Microsoft's roadmap and goes Orange with anger at the latest smartphone
March 2003
101 (March 2003)
Advanced Windows
Simon Jones
Packed off
Last month in our 100th issue, I praised Microsoft for learning from past mistakes and making its service packs for Office more reliable, but it looks like I might have spoken too soon. The newly released service pack (SP 3) for Office 2000, while fixing a lot of bugs and introducing one or two welcome new features, can also cause Outlook 2000 to consume 100 per cent of your CPU time, effectively crippling your PC.
**The previous service pack for Office 2000, SP 2, included the dreaded Ou
tlook Security Update and gave you no way to tone down or turn off its effects. Outlook would hide attachments it thought were unsafe; it would question you if any other process tried to look at your address book; and it would make you wait five seconds and press a button if another process tried to send email (even when you were trying to do a 1,000-item mail-merge from Word).
**While Exchange administrators were given tools to control all these settings and Outlook 2002 users could con
trol what attachments were considered unsafe, Outlook 2000 users who didn't use Exchange Server for their mail were left high and dry. They got all the 'security' and none of the control. When Office 2000 SP 2 was released, I published details of how to break the service pack apart, remove the patches for Outlook and apply the rest of them, which got the bugs fixed in the other Office applications but left Outlook 'unsecured' and with over 70 bugs still resident.
**With the release of Off
ice 2000 SP 3, Microsoft has finally given Outlook 2000 users the same control over the list of 'unsafe' attachments, but there's still no control over the address book guard or the send mail guard unless you're using Exchange Server. Office 2000 users were rejoicing nonetheless - at last they could get the bugs fixed in Outlook without completely crippling it. Joy quickly turned to frustration for many users, as they found that after applying SP 3 their computers would grind almost to a h
alt. Outlook would - under various circumstances - completely take over their CPU, leaving no clock cycles free to service any other processes.
**Outlook newsgroups and the MVPs (Most Valued Professionals) that answer users' questions were inundated with messages asking why their computer had seized up after installing the patch. I have it from two reliable sources that the slowdown was noticed during alpha testing of the patch, that the source of the problem was identified and correctiv
e action taken, but somehow that fix didn't make it into the final release of the patch. Someone at Microsoft must have screwed up in final testing in the version control/bug-tracking department.
**The latest information is that the bug only seems to affect users in Internet Mail Only (IMO) mode, so users with Exchange Server won't be affected, as they have to run in Corporate/ Workgroup (C/W) mode. IMO mode users can switch to C/W mode if they're affected by this bug, though they'll not
ice some change in features and functionality.
**Slipstick Systems has a good explanation of the modes in Outlook 98 and 2000 and how to change them on its website at www.slipstick.com/outlook/choosingmode.htm. If you do change from IMO to C/W mode, you'll lose support for IMAP accounts, WinFax SE and the ever so useful 'Send Using...' command.
**However it happened and whoever's at fault, it appears that the only wholly reliable way to stop Outlook slowing your machine to a crawl -
without changing mode - is to uninstall Office and reinstall it applying only SR-1a. If you apply SP 2, Outlook gets that security update with no control, while if you apply SP 3 you get control of the security update but your PC may slow to the speed of treacle in winter - what a choice.
**Microsoft took ages to say anything officially about this bug, but, as I write, it has just admitted to the problem and says that it's developing a fix. Knowledge Base article http://support.microsoft.
com/?kbid=811167 should point to the fix when it becomes available. Note that if you changed mode to C/W to work around the bug, you should change back to IMO before applying this fix. Remember, we're talking only about Office 2000 and Outlook 2000 here and not any previous (Office 95 and 97, Outlook 97 and 98) or later (Office XP, Outlook 2002) versions. They have their own problems and patches.
**Resources
*If you're happy to take the risk, you can find an overview of SP 3 for Office
2000 at http://support.microsoft.com/default.aspx?scid=326585. This Knowledge Base article gives details of how to get the patch or the administrative update and links to details of what has been fixed in the various Office applications. [Note that Microsoft Knowledge Base articles no longer need a Q before the number. This gives shorter URLs and browsers are supposed to automatically select the correct language for you.]
**If you're a single user, without an IT department, the easiest w
ay to get Office 2000 SP 3 is from the Office Product Updates website at http://office.microsoft.com/ProductUpdates/mainCatalog.aspx. This website will detect which version and revision of Office you're running and suggest all the patches you need to get up to date. If you have an IT department supporting you, you should contact it and ask if/when/how it's going to roll out this patch. If you are the IT department, you should download the administrative version of the patch and use it to
update your administrative installation point for Office.
**I suggest you try SP 3 on a couple of machines to see if they're affected by the slowdown bug before you roll it out to whole departments. Once you've installed Office 2000 SP 3, you can get Ken Slovak's Attachment Options Add-In to control which file types are eaten by Outlook and which you can receive from www.slovaktech.com/attachmentoptions.htm. This also works with Outlook 2002 (Office XP). It won't override any security
settings set by an Exchange Administrator. If you use Exchange Server and want to get round these file attachment restrictions, you should go and see your Exchange Administrator (Hint: take some cake).
**Delays, delays
*Microsoft has delayed the launch of Windows .NET Server 2003 again - it now won't ship until April 2003 at the earliest. This has had an unfortunate knock-on effect for application developers, as it means that Visual Studio .NET 2003, which had the codename Everett, has al
so been delayed. VS .NET Everett went to final beta in November 2002 and the non-disclosure agreements were rescinded in the middle of November - beta testers are now allowed to talk about the product, and people with Universal MSDN subscriptions should receive the final beta in their next shipment. You still won't be able to buy the product or release applications you've developed with it until April though, as Microsoft's Beta Code licence specifically forbids you from deploying applicat
ions developed with beta versions of its development tools.
**These delays are frustrating for all .NET application developers. Like many other developers, I've been planning and building applications using VS .NET 2003, but release of these applications is now on hold while all the options are considered. We could back-track the applications into the original VS .NET - not always an easy task - or redevelop small applications from scratch in VS .NET or even VB 6. Smaller companies may ha
ve been relying on the cash-flow generated by releasing new applications, and their plans and finances will have to be revised now there's another three-month delay in VS .NET 2003.
**A welcome exception is that applications that target mobile devices such as Pocket PCs, Smartphones and so on are exempt from the delay. These run on the .NET Compact Framework and Microsoft has announced that there will be an early release of the .NET Compact Framework and that developers will be allowed t
o deploy applications built using VS .NET 2003 on those platforms. Details of this Go Live licence should soon be available at http://msdn.microsoft.com/vstudio/device/, but aren't yet as I write this. The main condition in the Go Live licence appears to be an agreement that you will update your application to use/include the final version of the .NET Compact Framework within 60 days of it finally being released. This move is a little curious and suggests that Microsoft is:
**a) confide
nt that there are very few problems with VS .NET 2003 applications running on the .NET Compact Framework.
**b) pretty sure there will not be many users of these applications developed on beta platforms.
**c) worried that if it doesn't let developers release apps they've been developing for six months or more, they'll get disillusioned with the Pocket PC platform and switch over to Palm.
**Any or all of these beliefs could be true. What is far less clear is why delaying Windows .NET
Server 2003 has to mean a delay to VS .NET 2003 and the .NET Framework 1.1. Certainly, the .NET Server is 'fully integrated' with the .NET Framework 1.1, but the functionality in .NET Framework 1.1 now appears to be settled and .NET Server 2003 is at RC2 (Release Candidate 2). To my mind, Microsoft could release .NET Framework 1.1 and VS .NET 2003 to manufacturing, and any minor tweaks to the .NET Framework 1.1 needed for the release of .NET Server 2003 could be dealt with by service patches.
**I've thoroughly enjoyed beta testing VS .NET 2003. It's been remarkably stable for a beta product and the extra functionality compared with the original VS .NET has been very welcome, even though I've not yet used all of it. The user interface is slicker, with many helpful features. Intellisense has been improved with better statement and construct completion. For instance, if you type an Implements statement to declare that a particular class you're writing will implement a define
d interface, VS .NET will automatically write for you all the procedure headers you need to implement that interface, so you just have to fill in the procedure bodies. The Server Explorer, which gives the developer access to any SQL Server databases, has been improved. Virtually all the developer-oriented features of SQL Server Enterprise Manager are now implemented in the VS .NET IDE (Integrated Development Environment), so you no longer have to switch back and forth between a programming
tool and a database development/maintenance tool. There's also much better support for Oracle- and ODBC-compliant databases.
**Developing applications for PDAs and smartphones is much easier in VS .NET 2003. Support for 'Smart Device' projects is built-in, including device emulators and methods of deploying apps to the remote device for testing and for building installation kits for end-user deployment. C++ programmers finally get a forms painter for Windows Forms applications and Java p
rogrammers will probably like the inclusion of the Visual J# language. Upgrades from VS .NET to VS .NET 2003 will be available for $29 (
£18), with pricing in the UK yet to be fixed. If you have a Universal MSDN subscription, you should get the Final Beta and then the release version as part of your subscription.
**Holidays in Outlook
*If you run one of the 97, 98 or 2000 versions of Outlook, you may have found that you don't have any public holiday dates for 2003 and beyond. Governments o
nly confirm holiday dates for a limited number of years ahead, even though most people know the usual dates or recurrence pattern of most holidays. Microsoft ships a text file called outlook.txt or outlook.hol with each version of Outlook, which includes the holiday data for many countries for approximately three years from the release date of the application. When you import these holidays, they each become an individual appointment item rather than recurring appointments.
**Microsoft b
eing Microsoft, of course, it managed to get a few of the dates wrong in each file. If you're in charge of a corporate roll-out of Outlook, you can edit the holiday file before it's deployed to remove any data that is unwanted, correct any dates you know are wrong or even insert new dates of your own.
**Any user can add recurring appointments directly to Outlook, for Christmas, New Year, May Day (First Monday in May), Late Spring Bank Holiday (Last Monday in May), Late Summer Bank Holiday
(Last Monday in August) and so on, which will run forever or until you set an end date. If you've not been able to find the 'Add Holidays' feature before, it's under Tools | Options... | Preferences (tab) | Calendar (group) | Calendar Options... (button) | Calendar Options (group) | Add Holidays... (button). As you can see, it's easy to find and I hope you feel suitably abashed...
**Of course, if the government changes its mind about the dates of these bank holidays at any time, you'll h
ave to change the recurring appointments in Outlook by setting an end date for the old recurrence pattern and creating a new appointment with the new recurrence pattern. In the UK, the only yearly holidays that move are Good Friday, Easter Day and Easter Monday. For your information, Good Friday occurs on 18 April 2003, 9 April 2004, 25 March 2005, and 14 April 2006. Easter Day and Easter Monday are, of course, two and three days later respectively. For other holiday dates in other countries, you can download a new list from the Slipstick Systems website: www.slipstick.com/calendar/holidays.htm
Simon Jones is reduced to slow-motion with SP 3 for Office 2000, but enjoys beta testing VS .NET 2003
Given that Microsoft has finally started to make some noises about the short- and medium-term delivery of the new object filing system for Windows, I thought it would be worthwhile looking at a few of the issues that currently matter a lot in the management of networked storage, and how you should be thinking about storage as we move forward to the WinFS era.
**First, let's establish a few truths: putting storage directly into a server is a stupid thing to do. You end up with the complexi
ty of hardware RAID controllers, hot-plug drives plus all the associated cost, and then what happens? You have to put in lots of spare storage because it's expensive to add more during the working life of the product - it's cheaper to make sure you have enough up front, of course.
**Let's consider all that storage in more detail. There's a chunk of it used for standard housekeeping - loading the OS, holding the system page file and so forth - which is pretty much standard across all machi
nes and OSes. Then you need additional storage to hold application data. On a server, this is typically the data used by server-side apps like SQL Server and Exchange Server.
**Now do some simple sums. A 10GB hard disk is more than enough for the OS function unless you have many gigabytes of RAM, probably on 64-bit processors, and thus need to hold the swap file and possibly the system crash dump file. Have you tried buying a 10GB hard disk recently? It's quite hard, as 40GB seems to be t
he minimum. You might want to have this hard disk as a mirror pair, but many sites don't bother. (If you do, look at the simple mirroring capabilities of the software RAID built into Windows 2000/XP before you buy an expensive RAID controller.)
**Let's look closer at that application data storage. How much you need here will depend entirely on the nature of your application. Some places have a few gigabytes of data in their Exchange Server or SQL Server store, while others have over 100GB
of data in Exchange Server alone. Only you know the probable numbers for your organisation.
**So how is storage typically purchased for this part of the equation? Some well-oiled rules apply: go for SCSI, hot-swap and RAID in hardware and you'll end up with medium-sized, high-performance disks and everything will be right with the world. Except for one problem; namely, that your disk storage is an unmanageable mess, scattered shotgun fashion across a number of servers. And you have no m
eans of expanding it in any sensible way, nor of managing it either.
**One solution at this point would be to go for NAS (network attached storage) devices. These small boxes hold some hard disks and an intelligent motherboard, and they mount their disks onto an Ethernet network. Sounds like a good idea, and they can work well, but they suffer from the same problems as direct server-mounted storage. They're hard to manage, have 'hard edges' in terms of maximum capacity and, worse still, a
ll disk accesses are now running across your Ethernet, which could cause considerable network traffic congestion in many organisations.
**The grown-up solution is to go for a SAN (storage area network). Like the NAS, this is a box of hard disks with a controller, but the controller does a lot more work in this case. You can add more hard disks and just 'grow' the storage size automatically. If you have a 400GB SAN store and add another five 100GB hard disks, you'll probably be able to see the capacity increase automatically to 800GB, assuming one spindle's worth of storage will be consumed by RAID-5 checksumming.
**The big difference is that the connection pipes from your servers to the SAN are typically done via fibre-optic cabling and kept well away from the Ethernet user network. By putting all of your application data onto the SAN, you've pooled the storage into one heap, which is why SANs typically have a much higher percentage utilisation than NAS devices, and certai
nly far more than hard disks shovelled into the servers themselves. Given that you can easily add more disks, you'll probably end up with a 75-90 per cent utilisation of SAN space, compared with utilisations often down at the 20 per cent level for server-attached storage.
*Why does all of this matter for WinFS? Well, by moving over to a new filing system, Microsoft is going to allow, over time, a much richer set of storage facilities than we have at present. It has to do this, as other
wise there's no point in moving. Due to this, you might want to start thinking in new terms about the storage requirements of your company.
**Take a hypothetical example. You're a medium-sized company and you currently have some 400GB of data on your network. That's rising fairly quickly, say, at around 100GB per year at present. So, if you were putting in a new storage facility, what would be a good figure to aim for? Remember that the ability to add storage incrementally will be an imp
ortant factor, but I'd certainly be looking at around 1TB of storage to start with.
**I can almost hear you cry that this is five years' worth and that buying this upfront is too much, and you'd be right if you're thinking conventionally. But we need to think around the problem some more. You have 500GB to play with, of which 250GB is being used by Exchange Server, but you're contemplating moving to Titanium (Exchange Server 2003) next year. When you do that upgrade, are you going to wan
t to upgrade your master database files and not keep the originals online in case something goes wrong? Of course not - you'll copy the data stores first, then do the upgrade safe in the knowledge that you can always undo the upgrade in a matter of seconds by just pointing the old Exchange Server 2000 installation at the old stores. So that's taken us straight up to 750GB.
**Now let's say we turn on Index Server to trawl through all the information that's available to us in those databas
es. These indexes can easily consume 25-35 per cent of the total information size, and I think we've about hit my 1TB mark already.
**So you buy a 1TB solution with a plan to move to 2TB maybe within one year. Most importantly, though, you have to ensure that your backup and Disaster Recovery (DR) solutions can cope with backing up and restoring 2TB in an acceptable timeframe. Many people forget this and end up with the embarrassment of either not being able to do a full DR within an acce
ptable period, or of having backups that started the previous night running into the next working day.
**By installing a SAN, at least all your data will be in one place so you can buy one big backup/DR device and work with it safe in the knowledge that everything of importance is being backed up from the one storage space. At the end of the day, disk space is cheap - if I can buy a 250GB hard disk for under
£300 (albeit in a cheap IDE package), there's no reason why a few terabytes of d
isk storage should cost huge sums.
**Once you have a storage and DR solution that seamlessly scales that far, you can enjoy the ultimate pleasure of no longer thinking in terms of a limited store and instead imagining you have 'unlimited' storage. Put another way, how would you change your business processes, document and email storage and so forth if you had no worries about the storage space available? Is it really a good idea to face your users with a 30MB inbox limit in Exchange Server? Why n
ot give them unlimited space but manage it properly from one place? Move old emails out to subsections of the data store to lessen the load and improve manageability.
*I tend to keep one year's worth of email in my inbox, with child subfolders dedicated to one client each. Then there's an 'old' folder under each of these, which contains all the older data. So if I want to look for something from the last 12 months, my search is quick and doesn't impact the server unnecessarily. B
ut if I want to find something from five years ago, I know it's still online and can be found, even through an Outlook Web Access window from some clunky cyber café in west Los Angeles.
**So give it some thought. SAN solutions aren't cheap, but they're not ridiculously expensive either. Bringing all your storage together in one place will enable you to handle the huge storage nightmare of the next few years and look forward to a solution where data storage becomes one big 'cloud', actively
managed by your servers, rather than the current passive data-bucket solution that harks back to the era of the floppy disk.
**Security Matters
*Regular readers will recall that, in issues 86 and 87, I allocated quite a bit of space to security issues, both in this column and in my technical support column. This month, a series of separate incidents and a couple of press releases brought security back to my attention (not that it has ever been that far away). Let's face it, given the num
ber of criminals out there doing their level best to bring down the working infrastructure of the Internet just because they get a kick out of it, coupled with the permanent attempts of the scum who try to find their way into people's systems, security has become a lot harder than it was even 18 months or so ago.
**At that time, I recommended a number of items available for free use at the Microsoft website, including the Microsoft Personal Security Advisor (MPSA). I mentioned that this w
as produced for Microsoft by Shavlik Technologies, also the producer of a rather useful set of applications called HFNetChk (Microsoft Network Security Hotfix Checker).
**The replacement for the MPSA goes by the catchy title of Microsoft Baseline Security Analyzer (MBSA) and is part of Microsoft's all-encompassing Strategic Technology Protection Program. It's also produced by Shavlik Technologies on behalf of Microsoft, so working on the principle that it's always better to see what actua
lly wags the tail, I made my way to the Shavlik website to see what was on offer in the HFNetChk area, as that's still the technology that forms the backbone of the MBSA.
**HFNetChk comes in a number of versions, starting with the free command-line tool HFNetChk.exe, followed by the equally free HFNetChkLT - a cut-down version of HFNetChkPro, a product that's not in the least bit free in monetary terms but which might help you breathe a little more freely in security terms. Then there's th
e Shavlik Enterprise Manager, which is built on MBSA, includes HFNetChk and is used not only to report back on security vulnerabilities, but also to make suggestions as to what you need to do to improve your security.
**There's also the HFNetChkPro AdminSuite, which consists of HFNetChkPro accompanied by the Shavlik AccountInspector, providing you with a toolset that examines your systems for account vulnerabilities as well as patch vulnerabilities. Finally, Shavlik EnterpriseInspector c
an handle the inspection of thousands of systems, reporting back with clear and - initially at least - lengthy reports on the parlous state of your security.
**The free versions concentrate on examining your systems for patch vulnerabilities. HFNetChk.exe, the command-line tool, will check Windows NT 4, 2000 and XP for missing patches and provides you with a list when it has completed its probing. As well as the OSes above, HFNetChk.exe will also check for patches, or the lack of them, on
a whole slew of Microsoft products and server technologies, including various flavours of SQL Server, Exchange Server, Internet Information Server and Internet Explorer.
**HFNetChkLT is also free and comes with a neat GUI. It does system scanning as well and incorporates something Shavlik calls PatchPush. This handy feature means you don't have to run Windows Update on every system on your network any more - just let HFNetChkLT get on with the job of pushing patches onto the machines tha
t you designate should receive them. What I like most about HFNetChkLT is that you don't even have to spend ages laboriously installing agents on every machine you want analysed or patched - just install the software on one system and it gets on with its work of updating from one or two machines to the entire network.
**HFNetChkPro is the bigger brother of HFNetChkLT and, as you might expect, shares the refreshing lack of agents and tedious installation work of its younger sibling. In add
ition, it provides you with the ability to push your patches onto a much wider variety of desktop and server applications, including SQL Server, Internet Information Server, Internet Explorer, Exchange Server and more.
**It also overlaps slightly with the application field covered by the EnterpriseInspector in that it stores the reports you receive in a SQL Server database and enables you to design these reports. It also carries out thorough security checks on programs like SQL Server an
d produces a report that shows you just how secure those systems are. It's advisable to be sitting down when you start perusing these.
**The MBSA is also available as part of the Microsoft Security Resource Kit, something I applied for on the Microsoft UK website when it was first announced. This comprises a bunch of 120-day evaluation versions of Windows 2000 Advanced Server, Windows XP Professional, Internet Security and Acceleration Server 2000 Enterprise Edition and Systems Management
Server. Other tools included are our old friend URLScan, with which I had so much fun 18 months ago, and the IIS Lockdown tool. You can apply for this useful beast by following the links at www.microsoft.com/uk/security
*If you're a Norton AntiVirus user, you'll need to disable Script Blocking before launching the MBSA, as otherwise it won't run. To check out MBSA, visit www.microsoft.com/technet/treeview/default.asp?url=/technet/security/tools/Tools/mbsahome.asp
**Sadly, at the time
of writing, the download from that site was broken and refused to install after I'd fetched it, but I was saved by being able to install the version of MBSA that came on the Security Resource CD accompanying the Microsoft Security Resource Kit. I've alerted Microsoft to this fact so it will almost certainly have been fixed by the time you read this.
Jon Honeyball files away his worries, while David Moss looks at shoring up system vulnerabilities
March 2003
101 (March 2003)
Back Office
Dave Jewell
Delphi .NET
Okay, I lied to you. Last month, I promised that this month I'd be talking about Microsoft's new freebie ASP .NET development tool, Web Matrix (www.asp.net/webmatrix). I still plan to take an in-depth look at Web Matrix soon, but I've decided to devote this column to using the Delphi for .NET Preview Edition as the scripting language behind your ASP .NET web pages. The reason for this is that half the readers seem to want nothing b
ut Delphi, while the other half only want .NET, and this way I kill two birds with one stone.
**I've previously mentioned that the Preview Edition of the Delphi for .NET compiler is available to anyone who buys the Delphi 7 development system. Although this is just a simple command-line tool, you can easily configure IIS so as to use Delphi as its scripting language. The preview compiler comes with instructions on how to do this, but for those who have relatively little experience with II
S and ASP .NET, here's a blow-by-blow account of how to do it.
**First, you'll need to create a new virtual directory for your experiments. To do this, open the Control Panel, double-click Administrative Tools and then double-click Internet Information Services. This will fire up the Microsoft Management Console for IIS configuration. On the right-hand side of the dialog box, you'll see a hierarchical tree-view that lists the available IIS-related entities. Below the root node, you'll find
an entry for your local computer. Below this is another node entitled Web Sites, and below that another node named Default Web Site.
**If you right-click this node, you'll see a Context menu with a New option from where you can specify that you want to create a new virtual directory. This will invoke the Virtual Directory Creation Wizard, from where you may further specify a logical name (alias) for the virtual directory, together with the pathname of the physical directory that's to
contain your web content. You're also able to set up the permissions that dictate what a client can do. For simple messing around with Delphi for .NET, it's adequate to set read permission and to grant execute permission for scripts and executables. In my case, I set the virtual directory name to D4NPE (an acronym for 'Delphi for .NET, Preview Edition') and designated c:\d4npe as the physical content location.
**At this point, you've got your logical directory set up, but you need to add s
ome content. When IIS parses an ASPX file (your ASP .NET source code), it consults a special XML configuration file called web.config, which you can locate in the same physical directory as the rest of the content. Code Listing One shows the sample web.config file supplied by Borland as part of the D4NPE package. The important thing here is the add assembly field that references something called DelphiProvider and the subsequent references to compiler language, extension and type.
**Borla
nd supplies a small .NET assembly called delphiprovider.dll, which you can conveniently locate in a bin directory off your main content folder - IIS will automatically check this location when trying to load the assembly referred to inside web.config. I won't bore you with a detailed description of what this DLL does; suffice it to say that, among other things, it acts as the 'glue' connecting IIS with the compiler for a specific script language. In the web.config file we've told IIS that t
his language is called Delphi.
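**Code Listing One itself isn't reproduced in this text, but to give you a feel for what IIS is being told, a minimal web.config along the following lines should behave as described. The type attribute on the compiler element is a placeholder of my own - use the value from Borland's sample file rather than this guess:
    <configuration>
      <system.web>
        <compilation>
          <assemblies>
            <!-- Borland's assembly, dropped into the bin directory -->
            <add assembly="DelphiProvider" />
          </assemblies>
          <compilers>
            <!-- language, extension and type tell ASP .NET how to invoke the Delphi compiler;
                 the type value below is a placeholder, not Borland's actual string -->
            <compiler language="Delphi" extension=".pas"
                      type="DelphiProvider.DelphiCodeProvider, DelphiProvider" />
          </compilers>
        </compilation>
      </system.web>
    </configuration>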
**So now you've set up the configuration file, created a bin directory and copied Borland's delphiprovider.dll file into it. The final piece of the jigsaw, as you no doubt suspected, is to create an ASP .NET source file that makes use of all this stuff. To illustrate how this works, I've shamelessly borrowed Borland's editdemo.aspx code, which you can see in Code Listing Two.
**Inside the outer-level HTML tag, you'll find that there's a script section wit
h a language identifier that marks the scripting language as Delphi. This causes IIS to use the DelphiProvider assembly that I mentioned earlier. This is followed by the source code of a simple ButtonClick event handler, which - as I'll show you in a moment - is triggered in response to a button push. In case you're unfamiliar with the .NET Framework, the EventArgs class encapsulates a 'null' event and is frequently passed to event handlers. The Sender argument is of type System.Object and
you have to ensure that this type is qualified by the System namespace, because Object is a reserved word in Object Pascal.
**Below the script section there comes a standard set of ASP .NET control tags that define a textbox called Edit1 followed by a button with the caption 'Click Me!'. Notice how the OnClick handler for this button is pointed at the Delphi event handler. Finally, after a new paragraph tag, a label control called Message is also referenced. Since I've defined my virtua
l directory name as D4NPE, I can try things out by using the following URL: http://localhost/d4npe/editdemo.aspx
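**For readers who don't have the D4NPE package to hand, the essentials of editdemo.aspx boil down to something like the sketch below. This is my own reconstruction rather than a verbatim copy of Code Listing Two, so the control IDs and layout may differ slightly from Borland's file:
    <html>
    <script language="Delphi" runat="server">
      // Copy whatever is in the textbox into the label when the button is clicked
      procedure ButtonClick(Sender: System.Object; E: EventArgs);
      begin
        Message.Text := Edit1.Text;
      end;
    </script>
    <body>
      <form runat="server">
        <asp:TextBox id="Edit1" runat="server" />
        <asp:Button id="Button1" Text="Click Me!" OnClick="ButtonClick" runat="server" />
        <p>
        <asp:Label id="Message" runat="server" />
      </form>
    </body>
    </html>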
*The result can be seen in the screenshot below. Type some text into the edit box, click the button on the right and - hey presto! - the text is copied into the label control below. Not exactly rocket science, but it's still nice to be able to use Object Pascal as an ASP .NET scripting language.
**Flushed with success from the above code, I decided to tackle a slightly more co
mplex example. If you point your web browser at www.asp.net/tutorials/quickstat.aspx, you'll find a large number of runnable samples that show various aspects of ASP .NET. One of these, vb intro7.aspx, demonstrates how to add a WebForms calendar control to a page and determine the currently selected date in response to a button press. (Hint: go to the aforementioned URL, select 'Introducing Web Forms' from the menu on the left and scroll down until you find VB Intro 7.)
**As you'll have a
lready inferred, the script part of this demo was originally written in VB .NET so I translated it into Object Pascal, the result being shown in Code Listing Three. For the sake of brevity, I've shown only the script part of the ASPX file - browse this month's cover disc to find the whole thing. As you can see, the event handler simply links a bunch of strings together, including the selected subject string from the Category combo box and the currently selected date from the Calendar contr
ol. You can see the program running in the screenshot opposite.
**I was slightly stumped by the way in which the Delphi preview compiler complained about the absence of the Date property in the Calendar control - perhaps the old VB code referred to an earlier version of WebForms. However, after consulting the Microsoft documentation and replacing Date with a reference to SelectedDate, everything worked fine. As it happens, I was quite pleased that I wound up with a compiler error part-wa
y through the process, because when this happens it causes ASP .NET to display an error page that contains links to (a) the compiler output that includes the offending error message, and (b) the auto-generated Delphi code that gets fed into the compiler itself.
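**Code Listing Three lives on the cover disc rather than on this page, but the corrected event handler amounts to something like the following. The control names here (Category, Calendar1, Message) are illustrative choices of mine rather than necessarily those used in the listing, and note the use of SelectedDate in place of the Date property that upset the compiler:
    procedure SubmitBtn_Click(Sender: System.Object; E: EventArgs);
    begin
      // Glue the chosen subject and the date picked in the Calendar control into one string
      Message.Text := 'You have chosen ' + Category.SelectedItem.Text +
        ' on ' + Calendar1.SelectedDate.ToShortDateString;
    end;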
**This auto-generated code is, of course, produced by the DelphiProvider DLL and it's quite verbose. Among other things, it encapsulates the ASP .NET page as a descendant of System.Web. UI.Page and adds control definitions to this
class that make it possible for the Delphi code to reference other elements of the page as easily as if they were dealing with a VCL form. Thus, in the current example, the page class will include the following protected declarations:
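**Those declarations aren't reproduced here, but for the calendar example they amount to one protected field per server control on the page, along these lines (again, the field names are illustrative rather than lifted from the generated source):
    protected
      Category: System.Web.UI.WebControls.DropDownList;
      Calendar1: System.Web.UI.WebControls.Calendar;
      Message: System.Web.UI.WebControls.Label;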
*These goodies are automatically initialised by a number of other auto-
generated private methods of the page class. To see all this stuff for yourself, take a look at the file x3ymjaya.0.pas, which contains the generated Pascal source (this time without a compilation error in it). You can find it on this month's cover disc. As you've doubtless guessed, the aforementioned filename was also auto-generated. If you're wondering where these files are stored, look in the following directory location: D:\WINDOWS\Microsoft.NET\Framework\v1.0.3705\Temporary ASP
.NET Files\d4npe\
*As I indicated previously, d4npe is my own virtual directory name - you can call it something else. There are at least a couple of other levels of subdirectories below this, whose folder names are made up of eight hex digits. I'm not sure of the significance of this directory structure, but I imagine ASP .NET uses distinct directory trees for different ASPX files and pages within a particular virtual directory.
**ASPx of the Grid
*Since I'm majoring on ASP .NET this mo
nth, I thought you might be interested in ASPxGrid, a new product from Developer Express (www.devexpress.com). As its name suggests, this is a fully featured WebForms control written in 100 per cent C# code, which may be directly incorporated into your web apps through the magic of ASP .NET. Like other grid controls available from Developer Express, you can sort the grid against any particular column merely by clicking the relevant column header. You're also able to group data against an
arbitrary number of columns simply by dragging column headers to the 'parking area' immediately above the headers.
**For example, imagine a hypothetical grid displaying all your employee data; dragging the 'Site' column header to the parking area will immediately sort all the employees according to the site where they work. Drag the Room Number header onto the parking area too and you'll have an employee view that's sorted first by site and then by room number within each site.
**Yes,
we've been able to do this kind of thing for some time with Developer Express' VCL-based grid component, for example, but being able to do exactly the same thing from a web page is an almost surreal experience. The WebForms implementation even includes those famous little green 'tracker' arrows that indicate what will happen when you move or rearrange column headers.
**There's a built-in data navigator control that simplifies navigation and grid editing. It includes one-click buttons for
jumping to the first row, previous row, previous page as well as next row, next page and last row. From here, you can change the page size (number of rows displayed) and insert, delete or edit a particular row.
**ASPxGrid also includes integrated support for simple statistical functions such as COUNT, AVG, MIN, MAX, and SUM, which makes it possible to implement simple reports without writing any code. As with all Developer Express .NET controls, full source code is included in the packa
ge, which is reasonable considering that the whole thing is available for $150 (
£97). Internet Explorer 4 (or higher), Netscape Navigator 4 (or higher) and Mozilla are all supported. To see ASPxGrid in action, check out www.devexpress.com/products/net/aspxgrid/demo.asp, where you'll find a number of sample ASP .NET pages that have been set up to show the various capabilities of the grid control.
**I've believed for some time that ASP .NET heralds a rapid and exciting convergence of deskt
op and web application development technologies. These days, web users are coming to expect a far higher level of rich interactive content than was the case when programmers were limited to working with raw HTML. We're quickly moving to a point where the naive user wouldn't be able to distinguish between a desktop application and a web application were it not for the fact that it's running in a browser. Of course, this is being partly driven by the increasing availability of broadband, des
pite the fact that the UK is still lagging woefully behind in this respect.
**But what's especially interesting about all this from a development perspective is the convergence of programming models used. In the past, I've mentioned the excellent IntraWeb framework produced by Atozed Software (www.atozedsoftware.com), which makes it easy to develop web apps with Delphi using a RAD-based paradigm. You drop components on a form, set properties, write event handlers and so on, just as if you wer
e creating a desktop application. ASP .NET puts the same capabilities into the hands of C# and VB .NET developers, all the more so with the advent of powerful third-party components such as ASPxGrid.
**In the same sort of way, the Delphi Provider DLL for ASP .NET, which I've discussed above, hints at similar possibilities for the forthcoming Delphi for .NET product. If Borland gets it right, it will be possible to program for the desktop or the Web using the same methodology. Borland un
fortunately didn't get it right with WebSnap, but Atozed did get it right with IntraWeb. Let's hope that a few lessons have been learned in Scotts Valley.
**In a recent interview, Don Box (the ultimate COM guru) was asked about his future plans. He said that after spending so much time working on the internal behind-the-scenes plumbing, he was looking forward to getting back out into the sunshine and writing some nice apps again. I suspect that by the time Don surfaces from the catacombs, there'll be little to distinguish between web and desktop programming.
Dave Jewell looks at how the Delphi for .NET compiler can be used as a script language with ASP .NET
It would appear from looking at the unwanted contents of my inbox that the latest batch of dotcom entrepreneurs has wised up to the fact that one of the few 'almost guaranteed' ways to make money from the Internet is by dealing in the illegal or the barely lawful; that is, making available through the Web what is not, for good reason, easily available on the high street.
**The high-street vendor refrains from dealing in such goods and services for the obvious reason that the police, local
authority or other law-enforcement agencies are likely to prosecute and close the enterprise down.
**The online supplier, however, seeks to exploit three particular and recurrent features of the Internet. First, the trans-national aspect of the Internet allows a supplier to try and escape enforcement in a particular jurisdiction. Second, the relatively anonymous nature of the Web can allow a supplier to disguise their location and identity, making enforcement additionally difficult. Fina
lly, a consumer who has parted with their credit card details may find the enforcement of the contract for goods or services difficult when nothing arrives or access to an online service is denied. This latter is, of course, also partly due to the anonymous and trans-national aspects of the Internet, but in addition because, as a matter of public policy, national laws rarely allow the enforcement of contracts for the acquisition of an unlawful item or service.
**From a brief and non-scien
tific examination of my inbox, it seems that the most common barely lawful or illegal online service is the provision of pornography in all its forms. Coming a close second, however, is the supply of drugs - in particular prescription drugs - via numerous online 'pharmacies'. Leading the pack of readily available but illicit online drugs has to be Viagra. Only marginally less popular are various drugs designed to assist weight loss (Xenical, Phentermine, Meridia, Bontril, Ionamin and Adipe
x), depression (Prozac) and hair loss (Propecia). Performance-enhancing drugs used by athletes and body builders are also readily available.
**Although the anonymous nature of the Internet makes identifying the physical location of an online drug supplier difficult, it would appear from the content of their sites that the majority of online pharmacies are in the US.
**Regulation of the online sale of drugs in the US is complex, as there's legislation at both federal and state level. At
the federal level, enforcement is the task of the Food and Drugs Administration, or FDA (www.fda.gov), which provides useful guidance to US citizens and overseas purchasers about buying prescription drugs online at www.fda.gov/oc/buyonline/default.htm
*It's permissible in the US to sell prescription drugs online legitimately. The pharmacy running the online business must be registered and licensed and the consumer must either provide a valid prescription by fax or mail, or hold a detailed
online discussion with a properly qualified doctor, who will then issue a prescription.
**Illegitimate businesses will invariably circumvent these requirements - some sites sell drugs without a prescription, and any online consultation, if it takes place at all, often consists of no more than a questionnaire. The inherent dangers in such purchases are clear - for example, not every male is a suitable candidate for Viagra, especially if they have a heart condition. Mixing drugs with pre-
existing medication may also be highly dangerous.
**However, it's difficult for UK authorities to take effective enforcement action against US-based online pharmacies. The FDA could, of course, be invited to assist on an agency-to-agency basis, but there's no trans-national legal obligation on the US agency to take any enforcement action. An aggrieved UK citizen can, if they wish, submit their own complaint direct to the FDA via the Administration's site. Typically, enforcement in relatio
n to overseas supplies of drugs to UK citizens takes the form of UK Customs seizing the incoming mail and forfeiting the contents, but it is, of course, the consumer and not the supplier that loses out.
**In the case of online 'pharmacies' in the UK, action can be taken by the UK's Medicines Control Agency (MCA) under the Medicines Act 1968. The MCA (www.mca.gov.uk) is empowered to regulate the sale of prescription drugs. If the online supply of drugs involves non-prescription drugs such
as cannabis, heroin, cocaine or speed, regulation is principally via the Misuse of Drugs Act and enforcement is more likely to be a matter for the police.
**The site www.mca.gov.uk/ourwork/enforcemedleg/prosecutions.htm contains details of the recent prosecutions brought by the MCA. It's interesting to note that the majority of recent prosecutions appear to have been brought against vendors of Chinese herbal remedies, which have, in fact, contained powerful synthetic drugs. The MCA has a
lso brought a number of prosecutions against UK online suppliers of prescription drugs, although such cases can be counted on the fingers of one hand.
**For example, on 24 April 2001 at South Western Magistrates Court in London, John Williams pleaded guilty to one count of selling sildenafil (Viagra) otherwise than in accordance with a prescription given by an appropriate practitioner, contrary to sections 58(2) and 67(2) of the Medicines Act 1968 (see Medicines Control Act and the Online
Sale of Prescription Drugs). Williams, a director of Claremont Ltd, a company he operated from his home and which had been advertising Viagra for sale on the Internet, was sentenced on 5 June 2001 at Kingston Crown Court. On sentencing Williams, Judge Binning commented that the matter was serious because Viagra may be medically inappropriate for those purchasing and using it. Although he considered a custodial sentence, the judge took into account the small number of tablets sold (fou
r) and instead imposed a
£5,000 fine and ordered the defendant to pay the total prosecution costs of
£4,587.40.
*More recently in 2002, the MCA brought a successful prosecution against an online seller of Viagra in Stafford Crown Court. The vendor received a 12-month custodial sentence and was ordered to forfeit a substantial sum, which was estimated to represent the proceeds of his criminal activity. Such a limited number of prosecutions is clearly insufficient to deter the considerabl
e number of UK-based online suppliers of prescription drugs, although the prison sentence in the Stafford case is an interesting development. Only time will tell whether further prosecutions and stiffer sentences will discourage others from setting up similar illicit businesses.
**more online law
*Two months ago (see issue 99), I looked at the site run by LawZONE (www.lawzone.co.uk) and its facilities. LawZONE is a free site financed by advertising and the promotion of services relevant t
o lawyers, but its business model must be successful, since imitators are now appearing. Most of them, however, move us out of the territory of e-publications that are of general interest and into areas that are principally of interest to practising lawyers. Indeed, some of the resources mentioned below will require you to establish your credentials as a lawyer before even allowing you to subscribe.
**Infolaw produces a free monthly newsletter that claims to provide information on new sit
es and developments across the UK legal web, as well as analysis and comment on established high-value web resources for lawyers. It offers tips on how to get the most from the legal web and a summary of new resources added to the Infolaw site. Subscription is by way of registration on the Infolaw website www.infolaw.co.uk
*The Lawyer, a magazine aimed principally at commercial lawyers, has been around in print form for many years and now produces a weekly email newsletter - the imaginativ
ely entitled Lawyer News Weekly. The Lawyer's site is at www.thelawyer.co.uk
*The Law Society's Gazette, the magazine published weekly by the governing body for solicitors, can be inspected in its online form at www.lawgazette.co.uk. It contains easily digestible news items, plus more in-depth analysis of current legal topics. It's a bit dry, but could be interesting to the layperson as a revealing insight into the turmoil currently besetting the legal profession. The Law Society itself ca
n be visited at www.lawsociety.org.uk. This site includes information on finding a solicitor, qualifying as a solicitor and complaining about a solicitor.
**While on the issue of online versions of print publications, I should mention The Times Law Reports at www.thetimes.co.uk and the legal news section of The Independent at http://news.independent.co.uk/uk/legal/. There's currently some controversy surrounding The Times Law Reports, which have long been regarded as a valuable resource.
The newspaper has cottoned on to this fact and is seeking to make this aspect of The Times website available to subscribers only. At the time of writing, the issue hadn't yet been fully resolved.
**As the tussle over The Times Law Reports illustrates, detailed legal information is a valuable commodity, which is why most of the truly detailed sites are only accessible after the payment of a subscription. Such sites are likely only to be of interest to a practising lawyer, but examples can
be found at www.lexis.com, www.butterworths.com, www.westlaw.co.uk and www.lawtel.co.uk. Lawtel and Westlaw are both sites maintained by the well-established legal publishers Sweet and Maxwell, while the Butterworths and Lexis sites are both associated with rival legal publishers Butterworths. Finally, if you want to order a legal textbook, specialised suppliers can be found at Sweet and Maxwell (www.smlawpub.co.uk) and the Butterworths site, and don't forget that legal textbooks are ofte
n also available at non-specialist online retailers such as Amazon.
Angus Hamilton looks at the regulation of the online supply of prescription drugs
March 2003
101 (March 2003)
Legal
Paul Lynch
'Free' OPL
I haven't written much about Symbian lately. Although there has been much activity from the companies that all once belonged together as parts of Psion, almost all of it has been the sort of internal rumblings that happen within any such complex inter-relationships. This is all part of business, but some of these internal stories are more interesting...
**Teklogix, the US company acquired by Psion a couple of years ago, already had a good business based around Windows CE custom applicatio
ns and devices, but took over Psion Enterprise with good grace and continued to make the enterprise-focused machines from Psion. Now I understand that it is no longer a Symbian licensee.
**As I reported over a year ago, Teklogix took a decision not to upgrade the operating system of the netBook from ER5 to ER6, which makes reasonable business sense as Symbian narrows down ever more tightly on the smartphone market, with the Sony Ericsson slab of that market its preferred domain. This und
erlines the decision made by Psion to gradually back out of the PDA business and quashes any final hope for Psion devotees that a new version of their much-loved PDA will ever be produced.
**If any single aspect of the old Psion machines could be held up to illustrate the reasons why so many people bought and loved them, I'd have to say it was OPL, the interpreted programming language all these machines supported. Psion tried many times to move away from OPL, with semi-support for Visual
Basic and the eventual release of the C/C++-based SDKs, but OPL facilities were finally released and widely distributed for most Psion devices, even the Nokia 9210 series.
**OPL is remarkably neat and efficient on Psion hardware and very attractive to the hobby programmer. As such, it was probably directly responsible for the strong third-party software support for the entire Psion range. If I were to attempt to describe OPL, I'd call it an advanced Basic without line numbers, but devot
ees would go a lot further.
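**To give non-Psion owners a feel for what that means, a complete OPL program really is about as terse as Basic gets. A minimal example (written from memory, so forgive any small dialect differences between machines) looks like this:
    PROC main:
      PRINT "Hello from OPL"
      REM GET simply waits for a keypress before the program ends
      GET
    ENDP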
**Symbian is intending to release OPL as an open source project, making all the source code for OPL accessible to developers. It should become available for the latest Symbian smartphone models, the Series 60 and UIQ devices. UIQ is the basis for the Sony Ericsson P800, and Series 60 is the Nokia-based UI used in the Nokia 7650 (OPL is, of course, already available for the Nokia 9210 Communicator family).
**Some excellent apps written by enthusiasts are still
around for older Symbian PDAs and have shown their worth in ports to the Nokia 9210. It will be good to see this community of developers make its way onto the latest Sony Ericsson and Nokia smartphones.
**Cat Fishing
*Paul Ockenden - who writes the RWC Web Business column - and I both use the same old model of Compaq iPAQ and he mailed me the following useful tip for dealing with dusty iPAQs. Anybody who regularly uses one of the early iPAQ machines (the 3300 and 3600 series in particula
r, but also the 3700 family) will be aware of the problem of dust getting inside the screen. There seems to be a kind of magic that makes the dust particles appear much bigger than they really are. In particular, when using the machine in darkness with the backlight on, these specks of dust perform a good impression of the North Star.
**If you tap the screen, you can move the dust around, but can't get rid of it. Fear not though, as there's a solution to this problem, and you'll be please
d to hear it doesn't involve taking the machine apart.
**What it does need, however, is a donation from your cat. If you don't have a cat, try visiting your oldest auntie, as they always seem to have cats. The donation you'll need is a whisker - the longer the better. I'm sure this goes without saying, but please don't go pulling whiskers out of your cat - to do so would be incredibly cruel. You need to keep an eye on the carpet and other places where the cat sleeps, looking for whiskers
that have been shed naturally.
**Once you have one, you're ready to go fishing. Remove the stylus from your iPAQ and switch on the backlight. If you look through the stylus hole, you'll see a gap towards the front of the machine with light shining through it - this is the 'dust fishing hole'. Take the whisker and poke the thin end of it through that hole. You may need to jiggle it a bit, but you should find that you're able to poke it between the LCD screen and the protective glass. Onc
e it's there, with a bit of twisting, poking and a certain amount of good luck, you'll be able to make contact with the dust particles. They should be attracted to the thin end of the whisker by static, so once you've 'hooked' a dust particle simply (and carefully) remove the whisker and wipe the dust off on a tissue. With a bit of practice, you'll be able to maintain your PDA in an almost spotless condition.
**(NOTE: no cats were harmed in the preparation of this article.)
**EQ Pocket
*EverQuest is a games phenomenon that chews up far too many hours of otherwise potentially productive time on Windows desktop computers. As such, it's perhaps only fitting that its maker, Sony Online Entertainment (http://eqpocket.station.sony.com), should seek to occupy the minds of its 400,000+ user base when they're forced to be away from their desktop games machine, by releasing EverQuest for the Pocket PC.
**This simple hack-and-slash combat adventure game, along the lines of many s
uch for both the Pocket PC and Palm, has been given an EverQuest flavour. It isn't in any way a true MMORPG (massively multiplayer online role-playing game) like EverQuest, and it's doubtful that an MMORPG client for a game of the scale of EverQuest could ever be crammed into a PDA. It isn't a substitute for the resource-hungry PC desktop version either. How can a PDA game even resemble a full-scale MMORPG - can it replicate the atmosphere in any way? And, most important of all, how does it pl
ay as a game in its own right?
**Episode 1 is available now for $20 (
£13), with two more episodes planned. The game includes four race/class combinations (dwarf warrior, wood elf druid, human wizard and high elf magician), from which you can choose one and take it as far as level 12 (as opposed to level 65 in the PC game). The general topography is loosely familiar: it's based around Freeport, with locations such as Nektulos Forest, Neriak, Befallen and the Deserts of Ro, as well as new l
ocations like the ruins of Nordok. Later episodes will encompass more of the massive map of EverQuest.
**Spellcasters can have four spells memorised out of eight in their spell books, as opposed to eight out of hundreds, and inventory and character statistics are similarly cut down. These changes are, to my mind, to be expected: a 64MB Pocket PC isn't a multigigabyte desktop PC so something has to give. One constant feature of EverQuest for the PC is that the standard view is a first-per
son one and, although third-person views are possible, the majority of players stick with the first-person most of the time as it allows them to identify with their character and the world they're in.
**EverQuest for the Pocket PC is a single-player, isometric hack-and-slash game, which takes about 20 hours to complete (I hear). It uses a third-person isometric view, rather like Diablo. Unlike its parent, it offers automatic mapping and lets you look through walls as you get close, whi
ch is a useful warning of approaching nasties. The simple combat system of EverQuest is preserved - all you can do is auto-attack and cast spells. As there are no other players to work with, spellcasters can cast while in close combat and won't have their spells constantly interrupted by melee hits. Mobs (as the non-player enemies are referred to) are ranked in danger by a colour scheme identical to parent EverQuest's, indicating the relative level of the player versus the mob.
**There's
also a loose storyline, based around a simple and very linear quest, which requires you to locate and talk to a few NPCs. Again, like EverQuest for the PC, the quests are pervasive but not very strong. One of the biggest criticisms of EverQuest for the Pocket PC is that the quest conversations are too simple, direct and linear. For a Pocket PC game, the maps are huge, and walking back and forth between quest givers gets boring quickly.
**Installation is via the usual download, but with a
couple of catches. There isn't a MIPS version, only ARM (which means it plays on XScale processors as well), and the installed size is roughly 6MB so make sure you have enough space. Older Pocket PCs may be missing a file that isn't included in the distribution - gx.dll. If you have problems launching EverQuest for the Pocket PC, download GAPI 1.2 from www.microsoft.com/mobile/pocketpc/downloads/devdownloads.asp and extract and install gx.dll in your Windows folder. The developer's websit
e is at www.emodiv.com
**EverQuest for the Pocket PC preserves some of the flavour of EverQuest PC, especially for low-level players, despite the major differences. It might amuse a dedicated EverQuest PC player for a short time, as they play spot the similarities, but I'm not convinced it offers enough of the real hooks of EverQuest to satisfy an addict. As a Pocket PC role-playing adventure it isn't bad, judged on its own merits. There are some weak spots, but if these are fixed for the
next episodes it promises to be an enjoyable way to spend $20 (UK prices to be announced).
Psion devotees will jump for joy at news that open source OPL could soon arrive, says Paul Lynch
March 2003
101 (March 2003)
Mobile Computing
Brian Heywood
Reason to be cheerful
One feature most music or audio software packages have in common is that traditionally they tend to look like computer programs. This isn't particularly surprising, since that's exactly what they are, but it can often hold back the creative process as you try to grapple with whatever user interface the software developer has chosen to impose upon the task. This is especially true if you're already familiar with the hardware equivalent you might use to do the job in the studio, and for whic
h this software is standing in.
**This isn't merely a case of Techno-Luddism, but a recognition of the fact that a hardware user interface developed over many years of practical experience might actually be the most suitable for that application. And even where this isn't the case, the mere fact of being presented with a familiar control paradigm can get you working sooner so that you're able to start producing some useful results, which after all is the point of the whole exercise. As co
mputers become more powerful, there's no reason why software applications can't come with a number of user interfaces, which would mean users could pick and choose the way they interact with the software, rather like changing the 'skins' of your media player to suit your mood.
**There's a reason for this preamble, and that reason is Reason. No, I'm not going completely mad. For a number of years, Swedish developer Propellerhead Software has been beavering away with a modular sound generat
or and loop-based sequencer called Reason that sits alongside the rest of its product range, which consists of ReBirth, ReCycle and ReWire. Reason is now up to version 2 (for both PC and Mac) and it takes the retro-hardware interface concept to its logical extreme by closely modelling a studio rack full of equipment, complete with patch leads in the back. And apart from looking cool, it works.
**The application falls neatly into two sections: the 'virtual' rack and the integrated sequence
r that drives the hardware to produce the actual noise. Starting with the rack, it has a number of sound modules, effects units and a pattern generator that can be plugged into one of the 14-channel stereo mixers. I've been using version 1 of this software with the Subtractor Analog Synthesizer, a NN19 Digital Sampler, a Dr.Rex Loop player and a Redrum drum computer, augmented by the Malström Graintable Synthesizer and a NN-XT Advanced Sampler in the latest offering.
**The outboard effects consist of
RV-7 Digital Reverb, DDL-1 Digital Delay Unit, D-11 Foldback Distortion, ECF-42 Envelope Controlled Filter, CF-101 Chorus/Flanger, PH-90 Phaser, COMP-01 Compressor/Limiter and PEQ-2 Two-Band Parametric EQ. The mixer is a 14:2, with each channel strip (that is, input channel) having treble and bass EQ controls, four auxiliary sends, pan and a channel fader. You're able to set up multiple instances of the mixer and, indeed, any other module, which you can chain together to give you as many i
nput channels as you need, up to the limit of the computing power available on your PC. The list is rounded off by a Matrix Pattern Sequencer, which doesn't make or control any sound but can be used to drive the other modules.
**The point of listing all the available devices is to show you just how comprehensive the facilities are in this software. If you had the same amount of equivalent audio hardware, you'd have the basis of a pretty powerful MIDI programming suite or small studio. The
graphical interface presents these sound modules as they'd appear in a real studio's 19in equipment rack, with all their controls being altered using the mouse. This mouse/knob control interface creates the usual bottleneck, but you can assign both keyboard and external MIDI controller shortcuts to control the various modules instead.
**The sequencer section of Reason pretty much follows the now standard user interface - a variation on the Cakewalk/Cubase theme. However, the creation of
sequencer tracks follows the hardware setup rather than the other way round, so whenever you add a sound generator to the rack Reason will create a new sequencer track automatically. You can then use this sequencer track to record or program the MIDI information to make the selected sound module do its tricks. This track not only controls the musical performance, but also allows you to automate the controls on the sound module. So, for instance, using the analog synthesizer module (SubTrac
tor), you're able to include features such as dynamic filter sweep and resonance control as part of the sequence.
**Unlike most computer-based sequencers, the way the various components are connected is defined explicitly. Not only that, but you can re-route the virtual signals simply by re-patching the virtual cables between the modules. Hitting the Tab key on your keyboard displays the back of the rack, with the connections shown as cables between the outputs of the sound modules and th
e inputs of the mixer and other devices. You can reconfigure the signal routing by dragging the connectors with the mouse, and you may also cross-connect CV (control voltage) signals between modules and so on. This might sound complicated, but the beauty of the system is that, whenever you add a new component to the rack, it gets automatically wired up in a sensible configuration, which means you may never need to use the facility to re-patch modules. But if you do, everything is more or l
ess at your fingertips. What's more, you can start at a simple level by experimenting with the standard configuration, gradually becoming more sophisticated as your knowledge increases.
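**As an aside for programmers, the auto-wiring behaviour described above is easy to picture as a data structure. The following is a small, hypothetical Python sketch (none of these class or method names are Reason's own API): each device exposes output sockets, adding a device to the rack patches it to the next free mixer channel by default, and any 'cable' can later be re-routed explicitly, much like dragging the leads on the back of the virtual rack.

# Toy model of a rack with default wiring and explicit re-patching.
# All names here are invented for illustration; this is not Reason's API.

class Device:
    def __init__(self, name, outputs=("L", "R")):
        self.name = name
        self.outputs = list(outputs)

class Rack:
    def __init__(self, mixer_channels=14):
        self.devices = []
        self.patch = {}                      # (device, output) -> mixer channel
        self.free_channels = list(range(1, mixer_channels + 1))

    def add(self, device):
        # Auto-wire the new device's outputs to the next free stereo channel,
        # mimicking the 'sensible default wiring' behaviour described above.
        self.devices.append(device)
        if self.free_channels:
            channel = self.free_channels.pop(0)
            for out in device.outputs:
                self.patch[(device.name, out)] = channel
        return device

    def repatch(self, device_name, output, channel):
        # Explicitly move a 'cable' to a different mixer channel.
        self.patch[(device_name, output)] = channel

    def show_back_of_rack(self):
        # Roughly what you see when you hit Tab: outputs wired to mixer inputs.
        for (dev, out), channel in sorted(self.patch.items()):
            print(f"{dev}.{out} --> mixer channel {channel}")

rack = Rack()
rack.add(Device("Subtractor"))
rack.add(Device("Dr.Rex"))
rack.add(Device("Redrum"))
rack.repatch("Redrum", "L", 5)               # manual re-patch
rack.show_back_of_rack()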
**Reason does have its limitations, though. I guess it's really designed as a standalone application and, as a result, its integration options are limited if you want to use it in a more traditional studio environment. For instance, it doesn't have any time-code facilities (SMPTE/MTC), which means it could
be difficult to use for creating video soundtracks. However, it does have MIDI clock sync, so you could slave it off another sequencer or hard disk recorder quite easily. Reason can also integrate with other software applications that implement the Propellerhead ReWire standard, such as Cubase SX/VST, Logic and some Cakewalk products.
**This is only meant to be a quick overview of Reason, but by someone who's been using computer-based sequencers since the days when DOS was the only optio
n on the PC. This long association with the technology means I tend to take for granted that the 'standard' way of doing things is the only way, and products like Reason are a breath of fresh air. It's not surprising that Propellerhead received KEYS magazine's 'Most fun product to use of the year' award at the Frankfurt Music Fair this year. See www.propellerheads.se for more information.
**MIDI2WAV
*A question I'm often asked is how to convert a MIDI file into a wave file so that it can
be played on other computers with different - perhaps inferior - MIDI hardware. While MIDI (Musical Instrument Digital Interface) playback is a part of the MPC (Multimedia PC) specification that most modern PCs satisfy, the quality of the implementation can vary widely from machine to machine. The upshot of this is that if you develop a MIDI sequence on your PC that sounds pretty good, when you play it back on another PC it might sound like rubbish. This is due to a number of factors such as
the quality of the target MIDI sound generator or even just to the relative loudness of the various General MIDI instruments.
**The way to get round this problem is to create a WAV file, which effectively 'renders' your MIDI sequence into a frozen format. This is a bit like printing out a word processor document: you can no longer edit the data but you know the playback will sound exactly the same on any other PC as it does on yours. If you create a WAV file compatible with the CD Audio
format (sample rate 44.1kHz, 16-bit stereo samples), you can even burn it onto a disc so it could be played on any domestic CD player. Or you could just convert it into an MP3 file and send it out via email. Of course, there's a big data storage size hit when you convert a MIDI file to a digital audio file. For instance, I recently converted a three-and-a-half-minute GM MIDI sequence to an audio file and its size went from 30KB to 36MB. Even subsequently encoding the WAV file as a 64Kb/sec
MP3 file meant the audio file was more than 50 times the size of the original MIDI file.
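**To put rough numbers on that trade-off, here's a quick back-of-the-envelope calculation in Python (the 30KB MIDI figure is simply the one from the example above):

# CD-quality WAV versus a 64Kb/sec MP3 for a three-and-a-half-minute piece
seconds = 3.5 * 60                       # 210 seconds of audio
wav_bytes = 44_100 * 2 * 2 * seconds     # sample rate x 2 channels x 2 bytes/sample
mp3_bytes = (64_000 / 8) * seconds       # 64Kb/sec bitstream
midi_bytes = 30 * 1024                   # ~30KB MIDI file, as quoted above

print(f"WAV : {wav_bytes / 1_048_576:.0f}MB")    # roughly 35MB
print(f"MP3 : {mp3_bytes / 1_048_576:.1f}MB")    # roughly 1.6MB
print(f"MP3 is about {mp3_bytes / midi_bytes:.0f}x the MIDI file")   # ~55x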
**While there's nothing difficult about creating a WAV file from a MIDI sequence, it can be fiddly to sort out, as you need to run both your sequencer and an audio recorder, and you may need to connect your sound card's line input to its line output with an external lead. However, David Keppler pointed out a simpler alternative in the 'pc_pro' conference on Cix recently. Called Midi2Wav, this handy l
ittle utility takes control of your PC's mixer and uses Windows built-in MPC facilities to play back a MIDI sequence and record it onto your PC's hard disk at a user-selectable resolution. Its control panel enables you to select the sample format of the destination file as well as mix or even mute individual MIDI channels as you create the WAV file.
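**Midi2Wav's internals aren't published, but the basic play-while-recording trick can be sketched using the standard Windows MCI command strings, as below. This is only an illustration: the file names and the fixed sleep are assumptions, and it presumes the record source (line in, 'what you hear' and so on) has already been selected in the Windows mixer.

import ctypes, time

winmm = ctypes.windll.winmm              # Windows multimedia (MCI) API

def mci(command):
    # Send a single MCI command string and raise if the driver rejects it
    buf = ctypes.create_unicode_buffer(256)
    err = winmm.mciSendStringW(command, buf, 255, None)
    if err:
        raise RuntimeError(f"MCI error {err}: {command}")
    return buf.value

# Open a wave recording device at CD quality (some drivers also want
# bytespersec/alignment spelled out, hence the extra fields)
mci("open new type waveaudio alias cap")
mci("set cap channels 2 samplespersec 44100 bitspersample 16 "
    "bytespersec 176400 alignment 4")
mci("record cap")

# Play the MIDI file through the default synth while recording continues
mci("open tune.mid type sequencer alias tune")   # 'tune.mid' is a placeholder
mci("play tune")

time.sleep(215)            # assumed length of the sequence plus a short tail

mci("stop cap")
mci("save cap tune.wav")   # write the captured audio to disk
mci("close all")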
**You need to check it works on your particular PC, though, before you shell out the £10 or so for the software. This is because PC manufacturers have different ideas about how the MPC features should be implemented. For example, my Dell Precision 420, which I use in the office for multimedia development, will only work if I link the sound card's line in to its line out with an external cable, while my DIY studio PC equipped with a Sound Blaster Audigy 1 worked like a dream without a patch. Mind you, the Dell not working was no great loss, as the Audigy's software synthesizer gave better results. Download the demo and purchase from www.midi2wav.com.
Brian Heywood looks toward the future of computer audio development
March 2003
101 (March 2003)
MULTIMEDIA:AUDIO
Steve Cassidy
Basically data
toolingX
tooling
tools
totting
totting
e-bill
touch
story
trace
trackR
track
requests
freeR
traht
transparent
transparent
times
trapsG
travel
travel
trawling
trawling
usenet
treo-hugger
tricks
tripping
trojan
trojan
trouble
trouble
store
troubles
truck
trust
trust
tungsten
tungsten
tuning
tuning
track
turnoff
turnoff
tweaks
palms
twins
twips
steps
typing
vegas
lands
warningd
watchdog`
clipping
services
webcam`
subwindow`
subwoofer
subzone
succeed
succeeded
succeeding
succeedsg
successH
successionW
successivef
successorS
suggestions@
suicide@
suitably@
suites@
sulk@
summing@
sun's@
super-high-resolutio@
superannuated@
superhuman@
superior
superiority
supervisor@
supplied
supplier
supplier's@
supply
supply-induced
support
support's
support-pack
supported@
supporter@
supportive@
supports
supportsymmetrical
suppose
supposedly@
supremo@
sure-fire@
surfacedesigner@
surly@
surprised@
surprising
surprisingly@
surprisingly-advance
surrounding
surroundings
surviving@
suspect's@
suspends
suspiciously@
swallowed
swapped@
swedish
swerve@
taking@
tale@
talk-show@
talking/ebook@
taping@
target@
targeted@
frankfurtv
frankl
franklin
franklyC
frantic
franticallyL
fraser
fraternity
fraud
fraud-related
fraudsG
fraudster
fraudsters
fraudulentG
fraudulently
fraudulently-used
fraught
fraunhofer
fraunhofer-thomson
brian
heywood
looks
toward
future
computer
audio
devev
brian
heywood
marches
sound
different
drums
brian
heywood
through
don'ts
producing
brian
heywood
shares
knowledge
stuff
brian
heywood
sheds
light
brian
heywood
shows
compress
sound
files
brian
heywood
takes
sibelius
gives
brian
heywood
takes
latest
digital
offering
brian
guitar
heywood
shows
bricks
bridge
businesses
button
buying
buying
goods
other
personally
create
cards
carefulE
This column is an act of rebellion. Networks? Who needs to talk about networks? They're just a bunch of wires. Let's talk about something really interesting - databases. Almost every piece of data you're likely to want to use is stored in a database, and even the file system of your PC or server could be considered to be a 'database' of a kind. The most frequent problem I've been seeing over the last few months is caused by people throwing good money at the new generation of Pentium 4 Xeon
servers, converting all their stuff to Windows 2000 Server, then looking at their core product or database and not seeing much improvement.
**If I were to look for a reason for the recent decline in IT spending - and an even steeper decline in trust - this notable absence of expected performance gains after laying out great wads of cash on new kit could just be the problem we professionals ought to be addressing. It could, of course, be that I'm seeing a skewed selection of clients with
inadequate databases, and that just around each corner lies a different set of businesses that are all gasping i